Bots just wanna chat
Published by Christy Dena May 28th, 2005 in Uncategorized, HAI.

News from ARGN that an ARG (Alternate Reality Game), Jamie Kane, is currently being beta-tested. The game starts with the death of a pop star. The gameplay involves players interacting with characters through a forum, email, phone and messenger. The character conversations through messenger are AI-driven. I’ve signed up for the beta-test and have had a chat with a character named Jess. Here are some of my preliminary observations of the chat.
The conversation started off really well, but then there were some weird answers. The character kept asking me questions about what I like: TV shows, movies and so on. This fits the target demographic, teenagers, but I also felt that real conversations aren’t that one-sided. Who wants to know heaps about the other person without being asked anything themselves? The questions, I assumed, were also an information-gathering device.
I was also getting annoyed by the inane chat because I really wanted to get down to business: to figure out why the pop star, Jamie Kane, was dead. Jess just seemed to want to chat and gush over Jamie. I guess this matches the players; they would probably be more interested in a casual chat than anything else. I also noticed that Jess got upset with me once when I asked again about Jamie, and I was warned that if I didn’t like Jamie she’d kick me off the forum. So this is an indication of the boundaries, the rules of play. As you can see from the mini transcript below, I tried winning Jess over by playing a fan, because I thought that would be the only way I could get info. But Jess ended the conversation, and apparently we will catch up again tomorrow.
JESSICA> Aww, how cute! That’s cool. I was always driving my parentals mad by playing loud music - especially Jamie :D You into anyone else apart from Jamie K?
CHRISTY> No. What happened with Jamie?
JESSICA> Jamie Kane is the best by miles *swoon*! Anyway, Christy what did you like best about him?
CHRISTY> His cute smile.
JESSICA> Well, you know how I feel about him. Looks like we’ve got another massive fan here! Cool…
On the face of it, it seems that the messenger conversations will play a social function more than a plot one, with the forum and email modes carrying more of the plot because they are easier to control.
Sign up at the BBCi Jamie Kane beta site and let me know your observations on the bot techniques they use.
I wonder if the “social function” is just another channel to add richness to the game world, or whether the “data collection” you suspect might relate in some way to demographic analysis and commercial tie-ins. Were there terms of service you had to sign off on when you registered that talked specifically about direct marketing, or about what they do with your chat logs?
A murder mystery is a solid story for group investigation, and investigation seems to be the preferred activity of ARGs. Still, murder limits the aftermarket. A really canny marketing firm would probably turn the story into an abduction, and then create a spinoff personality if the game goes well: celebrity creation à la the Blues Brothers, The Monkees or Spinal Tap. The music style might still be partly unspecified; the audience for the game might create their star in the process of investigating him. Emergent pop stars.
The ‘social function’ definitely added to the experience, but I’ll have to decide after more chats whether it is a primary or satellite factor. One of the creators, Rob Cooper of the BBC’s Interactive Drama and Entertainment Dept, comments in a forum that the game is intended for teenage girls: the ‘storytelling and social aspect of ARGs give them a depth that we think girls will really enjoy’. Cooper also adds that, in their experience, teenage girls are a lot more forgiving of chatbot dialogue errors:
As for the demographic data collection, there was this little terms-of-reference note in the description of the game:
Legal bit: The BBC will only use the information you give us to personalise the game. We won’t give your details to anyone else and we’ll delete them when you finish the game or unsubscribe.
There is also, however, a FAQ page about the data requested. The data collected is on three levels:
The first is just for BBC membership and can be changed and removed once the game is over. The third is a device that is part of the gameplay (you can speak to characters). The second, the personal data that the bot Jess was trying to get out of me, is described as follows:
So, by collecting more info about the user and being able to repeat it back, the bot is said to appear more real. This technique of building a user profile is quite popular, and I must admit I like it when a bot ‘remembers me’. Logically, though, I’m not thinking that the bot is human or more real… I’m liking the idea that the designers are taking time to make details about me an important part of gameplay. As if this proves that I am factored into the game design, that I play a role (how can the bot talk to me without my details?). Jeremy, I think this relates to your ‘implied code’ concept…
The repeating back of personal details also creates a sense of continuity: the combination of me and the game is following a path together. I’m leaving a trail as well, crumbs of myself in every session. And when these details are referred to, they are a textual weaving of myself and the creator’s story.
As I said, this technique has been used a lot and is often played with by users. For instance, in the ‘interactive drama’ Jupiter Green, I had to fill out a profile of myself at the beginning of the ‘game/story’, and the characters would then email me with details accordingly. So, as part of my research, I signed up as a male and as a female, with different likes and dislikes. I was treated pretty much the same regardless of gender, but the other stuff changed.
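The ‘memory/profile’ technique described above can be sketched in a few lines. This is purely illustrative, not the Jamie Kane or Jupiter Green engine; the class name, the pattern-matching rule and the canned replies are all my own assumptions about how such a bot might store a player’s details and weave them back into conversation.

```python
import re


class ProfileBot:
    """A toy bot that harvests facts a player volunteers, then repeats them back."""

    def __init__(self):
        self.profile = {}  # facts the player has volunteered, keyed by topic

    def listen(self, message):
        # Crude pattern-matching: harvest "my favourite X is Y" statements.
        match = re.search(r"my favourite (\w+) is ([\w ]+)", message.lower())
        if match:
            topic, value = match.group(1), match.group(2).strip()
            self.profile[topic] = value
            return f"Ooh, {value}! Good choice :D"
        return "Tell me more about yourself!"

    def recall(self, topic):
        # Repeating a stored detail back creates the sense of continuity.
        if topic in self.profile:
            return f"You said your favourite {topic} is {self.profile[topic]}, right?"
        return f"You never told me your favourite {topic}!"


bot = ProfileBot()
bot.listen("my favourite band is the killers")
print(bot.recall("band"))  # the bot 'remembers' the player
```

Even something this simple produces the ‘it remembers me’ effect, which is exactly why the profile data matters more to the player experience than to the plot.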
And yes, I agree that a ‘death’ is a less motivating plot point. But I’m trying to find out about the death, presuming the pop star isn’t really dead. The more I try to find out details, and the more the characters avoid the question, the more I think I’m onto something. But it is early days. Obviously, finding the murderer is of primary importance.
FYI: the game is meant to be played for about 20 minutes every day for 15 days. It is not real-time, so anyone can join in at any time (unlike most ARGs). I should also add that beta-testers are asked to use the Alternate Reality Gaming Network’s special forum on the game.
Some further observations:
Had a chat with Greta (a bot) and that was a good dialogue: short and enjoyable. The Greta character was full of praise at how cool I was and how much she wanted my help. This got me thinking about myself rather than about the bot and its problems.
She drove the conversation. I only had to respond, as myself but in ‘character’ (an important note: our own motivations as people must be able to be expressed through the in-game role we’ve been given).
But what I also found interesting was something I haven’t seen before: a cross-program reaction. The bot gave me a link to her blog, which then supplied me with an online computer program to do some image manipulation. Once the manipulation was done (and I had solved a mystery!), the program immediately emailed the pic to the Greta character. I went back to the messenger window (which Greta had asked me to keep open so I could tell her immediately what I had discovered). I told Greta, the bot, that I had sent her the email and she went off ‘to check’. She ‘came back’ and confirmed she had received it. Then she posted it to the forum, thanking me. I thought this was a really well done integration of different systems talking to the bot engine. To me, this is what bots are all about: having real-time responses to things.
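One way to wire up a cross-program reaction like this is to have the mail handler and the chatbot share an event queue, so the bot can ‘go off to check’ and then confirm receipt in the chat window. This is a hypothetical sketch of that pattern only; the function names and messages are mine, and none of it reflects the BBC’s actual architecture.

```python
from queue import Queue, Empty

events = Queue()  # shared between the (simulated) mail system and the bot


def on_email_received(sender, attachment):
    # The image-manipulation tool would trigger something like this
    # when it mails the finished picture to the bot character.
    events.put(("email", sender, attachment))


def bot_check_inbox():
    # The bot polls the shared queue before replying in chat,
    # so its confirmation reflects what actually arrived.
    try:
        kind, sender, attachment = events.get_nowait()
        return f"Got it! Thanks {sender}, I'll post {attachment} to the forum."
    except Empty:
        return "Hmm, nothing yet -- did you send it?"


on_email_received("Christy", "solved_image.jpg")
print(bot_check_inbox())
```

The point of the design is that the bot’s dialogue is driven by an external system’s state rather than canned replies, which is what makes the ‘she checked and confirmed’ moment feel real.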
The rush to get stuff done was good, distracting me from analysing the bot design. Just as a bad actor allows one’s mind to wander onto their faults, a bad bot does the same. I think the action, the immediacy of the task and the effort needed on my behalf were big points towards the good actor/bot scenario.
Here is some of the dialogue so you can see just how smoothly it went:
Another thing I like is the stalling: the time between my entry in the messenger box and the bot’s response. This is a tactic many bot programmers use to make the interaction seem more real.
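The stalling tactic amounts to pausing roughly as long as a human would take to type the reply. A minimal sketch, assuming an arbitrary typing speed and cap that I have made up for illustration (real bot engines will tune these differently):

```python
import time


def typing_delay(reply, chars_per_second=8.0, max_delay=4.0):
    # Longer replies 'take longer to type'; the cap keeps the
    # player from waiting unreasonably on a long message.
    return min(len(reply) / chars_per_second, max_delay)


def send_with_stall(reply):
    # Pause before 'sending', simulating a human typist.
    time.sleep(typing_delay(reply))
    print(reply)


send_with_stall("brb :D")  # short reply, short stall
```

Scaling the pause to the reply length matters: a long message that appears instantly gives the bot away just as badly as no pause at all.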
More coming soon.
This looks like fun. I try to use the “memory/profile” trick in Barthes Bachelorette and feel it nicely fulfills a narcissistic side to bot usage.
I’m becoming more and more aware of the role of context in producing the chatbot. If we compare the Loebner Prize bots to Smarter Child to Greta to the others we’ve mentioned, context is just as important as the replies themselves in situating how we interpret any given bot. If we made a list of key features of bots, I would think ‘context’ would be more constitutive than descriptive or modifying.
Can you explain this again?
What I mean is, it was easy for me to converse, to quickly respond to anything, because I didn’t have to think about the attitude of my character in the game. If I had taken on a character that wasn’t in line with how I naturally think, then the dialogue would not have run smoothly. Bot conversations are quick, so we’re running on automatic (on how we naturally react) just as much as the bots are!
I guess an example of a player role not aligning with my natural self would be if I had to play the killer or kidnapper: I don’t have that sort of knowledge at hand. The same goes for playing the forum owner. Thinking about it now, what I’m saying is that the character of the player must by default have limited knowledge. In terms of motivations, my role must be one of wanting to help, to do things… otherwise I have to play a background character…
Update on the Wikipedia scandal blogged 28th August.