Trevor Stone's Journal
Those who can, do. The rest hyperlink.
Secret Goals 
11th-Mar-2011 12:23 am
One of the basic distinctions in game theory is between games with perfect information, like chess or tic-tac-toe, and games with hidden information, like poker (each player's cards and the deck) or Monopoly (the dice). But this evening I was thinking about games with not only hidden information, but hidden goals, like Aquarius (each player's goal card is hidden).

Does anyone know of game-theory work on hidden goals? In some cases (like Aquarius) it may be easy to treat the hidden goal as ordinary hidden information. But in other situations (like politics), it may change the understanding quite radically. It seems like it would be very hard to develop a predictive model of a player's actions if you don't know what he's going for (e.g., where his goal sits in emotion-money-ideals space).
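
To make the "ordinary hidden information" case concrete, here's a rough Python sketch -- the goals, colors, and numbers are made up, not taken from any particular game -- of predicting a player's move when all you have is a probability distribution over what they might be going for:

```python
# Toy sketch, not any real game: treat the unknown goal as ordinary hidden
# information by keeping a probability distribution over candidate goals and
# predicting the move that looks best on average under that distribution.

# Three hypothetical goals a player might secretly hold, each scoring a hand.
GOALS = {
    "collect_red":  lambda hand: hand.count("red"),
    "collect_blue": lambda hand: hand.count("blue"),
    "mixed_pairs":  lambda hand: min(hand.count("red"), hand.count("blue")),
}

def predict_draw(hand, belief, options=("red", "blue")):
    """Predict the draw that maximizes expected score under our belief
    about which goal the player secretly holds."""
    def expected_score(new_hand):
        return sum(p * GOALS[goal](new_hand) for goal, p in belief.items())
    return max(options, key=lambda color: expected_score(hand + [color]))

# With no evidence yet, weight every candidate goal equally.
uniform = {goal: 1 / len(GOALS) for goal in GOALS}
print(predict_draw(["red", "red"], uniform))  # prints "blue" with these toy numbers
```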
13th-Mar-2011 05:37 am (UTC)
To expand on your question: Fluxx has open goals, but since goals are cards, any goal that isn't currently the goal is held as secret information.

Chrononauts has hidden goals (you draw a time-traveller dossier, essentially, that defines what your objective for the game is, and the reasoning behind it) as well as open, generic goals. IIRC, Chrononauts may also have open goals that are changeable via cards.

Goals that are changeable via cards (which are hidden information) kinda remind me of the LISPy functions-as-data/data-as-functions way of thinking.

If so, are the Looney Labs games we're talking about in some way equivalent to the lambda calculus? Would that then imply that a strategy is not computable?
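
For what it's worth, the functions-as-data reading can be sketched pretty literally. A minimal Python version (the goal names here are invented, not actual Fluxx cards): goal cards are just functions sitting in a deck until one of them is played as the active win condition.

```python
# Minimal sketch of goals-as-data: each "goal card" is a first-class function,
# and playing one swaps which function currently decides the winner.
# The goal names are invented, not actual Fluxx cards.

def ten_cards_in_hand(player):
    return len(player["hand"]) >= 10

def no_keepers_left(player):
    return player["keepers_in_hand"] == 0

goal_deck = [ten_cards_in_hand, no_keepers_left]  # a pile of functions, held as data
active_goal = None

def play_goal_card(card):
    """Swap in a new win condition: code passed around like any other card."""
    global active_goal
    active_goal = card

play_goal_card(goal_deck[0])
player = {"hand": list(range(11)), "keepers_in_hand": 2}
print(active_goal(player))  # True -- whichever function is active decides the win
```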
13th-Mar-2011 04:36 pm (UTC)
You don't need hidden goals to get non-computability. It could be publicly known that one player's goal is to create an infinite loop if and only if the opponent's strategy would not create an infinite loop.

Ticket to Ride is another good example of a game with hidden goals: You know everyone's trying to complete some set of routes, but you don't know exactly which. At least, until they curse at you for taking the Calgary to Winnipeg leg before they can. And this is sort of a code-is-data thing: your goal (code) is data (a card in the deck).
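
Restating that self-referential goal in code makes the halting-problem flavor explicit. This is only a sketch: `would_loop_forever` is a hypothetical total decider, and the whole point is that no such function can exist.

```python
# Sketch of the self-referential goal above. `would_loop_forever` is a
# hypothetical total decider -- no such function can actually exist.

def would_loop_forever(strategy, state):
    """Hypothetically return True iff `strategy` never terminates on `state`."""
    raise NotImplementedError("no total decider for this exists")

def contrary_strategy(state):
    # Goal: loop forever if and only if the opponent's strategy would NOT loop.
    if would_loop_forever(state["opponent_strategy"], state):
        return "done"   # opponent would loop, so terminate
    while True:         # opponent would terminate, so loop forever
        pass

# Pitting contrary_strategy against itself is the usual diagonal argument:
# whatever would_loop_forever answers about it comes out wrong, so no
# computable strategy (or referee) can evaluate this goal in general.
```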
13th-Mar-2011 04:45 pm (UTC)
"You don't need hidden goals to get non-computability. It could be publicly known that one player's goal is to create an infinite loop if and only if the opponent's strategy would not create an infinite loop."

Well, within the standard cards, I don't think those goals exist, but I totally support that. I bet I've got some blanks here somewhere....
13th-Mar-2011 04:47 pm (UTC)
Oh, not in Chrononauts. I was assuming the existence of a different game; perhaps one where you can make up any goal you want. Calvinball?
13th-Mar-2011 08:28 am (UTC)
If you don't know someone's goals, then the only way to predict their actions is by inductive rather than deductive reasoning, right? Which is indeed slower and likely more error-prone.
13th-Mar-2011 04:30 pm (UTC)
I was thinking of a situation where you might know what sorts of goals someone might have, but not in any detail. rubicantoto's example of Chrononauts is good: you know their goal involves collecting a particular set of three cards, but you don't know which three. And as the game goes on, you may be able to guess based on their actions thus far.

In a business or political situation, I was thinking of something like multi-variable bargaining where each side doesn't disclose the relative importance of each variable.
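
To put a toy model on the "guess based on their actions thus far" idea: a short Python sketch that keeps a belief over candidate goals and does a Bayesian update after each observed move. The route names are borrowed from the Ticket to Ride example above and the likelihood numbers are invented; the same shape would work for bargaining if the candidates were weightings over the variables instead of routes.

```python
# Toy sketch: infer a hidden goal from observed moves via Bayesian updates.
# Candidate goals and likelihood numbers are invented for illustration.

CANDIDATE_GOALS = ("route_A", "route_B", "route_C")

# Which observed moves would help each candidate goal (made-up mapping).
HELPS = {
    "route_A": {"claim_calgary_winnipeg"},
    "route_B": {"claim_calgary_winnipeg", "claim_winnipeg_duluth"},
    "route_C": {"claim_seattle_portland"},
}

def likelihood(move, goal):
    """P(move | goal): a player working toward `goal` mostly makes helpful moves."""
    return 0.8 if move in HELPS[goal] else 0.2

def update(belief, move):
    """One Bayes step: P(goal | move) is proportional to P(move | goal) * P(goal)."""
    unnormalized = {g: belief[g] * likelihood(move, g) for g in belief}
    total = sum(unnormalized.values())
    return {g: p / total for g, p in unnormalized.items()}

belief = {g: 1 / len(CANDIDATE_GOALS) for g in CANDIDATE_GOALS}
for move in ("claim_calgary_winnipeg", "claim_winnipeg_duluth"):
    belief = update(belief, move)

print(belief)  # route_B now carries most of the probability mass
```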
13th-Mar-2011 04:55 pm (UTC)
Or (to sidetrack a bit) in a personal situation, like dating.

I'm being somewhat cynical here, but also sort of serious. I think the vast majority of people have some kind of calculus for determining when to drop bombs of various types, but not until the nth date. It's common advice on Savage Love, for example, to bring up the more fringe kinks later and later.

In general, this is a very practical treatment of it as "ordinary hidden information". The most-specific/lowest-weighted behavior-affecting variables are the last ones you need to track to build an accurate model.
13th-Mar-2011 10:35 pm (UTC)
Huh. That last point actually reminds me of Deborah Tannen's The Argument Culture, which I've just been reading -- specifically the part about how, especially (perhaps) in this country, we've developed a prevailing notion that in business or legal negotiations you have to demand more than you really want just to have any hope of getting anything. That only makes it much harder for people to come to a mutually satisfactory arrangement, even when one does theoretically exist.