Saturday, October 18, 2008
What does a dog say?
I'm sure you all remember when you were little kids and your parents asked you a whole series of questions like "what does a dog say?", "what does a duck say?", etc. Well, Agathe Jacquillat and Tomi Vollauschek, graphic designers who met at the Royal College of Art in London while taking a post-grad course on Communication Art and Design, have taken that childhood game a step further, possibly in another direction.
Jacquillat and Vollauschek are responsible for the delightfully intriguing and addictive site bzzzpeek, where they have collected voice samples of children from around the world responding to questions like "what does a dog say?". The best thing about this is that their responses are only occasionally "woof woof". Russian dogs say "guff guff", Japanese dogs say "wua wua", and goodness knows what Korean dogs say. Korean onomatopoeia tends to be the strangest for American English listeners.
All this serves to bring home a point that I haven't made yet, one that is often overlooked in Cognitive Science: language is not just for communicating, it's also for perceiving. It's fair to assume that Russian dogs bark like American dogs (they have the same vocal apparatus), so it can't be that American dogs make "woof woof" sounds any more than Russian dogs make "guff guff" sounds. Rather, where American speakers hear "woof woof", Russian speakers hear "guff guff". The language(s) we speak influence what we are able to hear (and communicate). For more on this topic, see the Sapir-Whorf Hypothesis and the work of Karen Mattock.
I really appreciate this sort of work because it marries principles of good design (simple intuitive complexity) with novel, unpretentious human-sized science.
Wednesday, October 8, 2008
a closer look
Eyes are important. Not only are they the primary portal through which we perceive the world (~30% of cortex is involved in vision); they also reveal the interests and intentions of others. It's no mistake that the eye has been referred to as "the window to the soul" since Biblical times.
The other day my professor made the point that eye-contact is not a property of the individual. Eye-contact emerges from interactions between individuals. A little bit of close observation reveals that eye-contact is not just a social phenomenon; it's also a powerful social tool.
Have you ever tried to catch someone's eye? Or how about in movies, when lovers/enemies lock eyes before they kiss/fight? Or when a tour guide advises you "not to make eye-contact" with vendors in a foreign country?
When someone is lying they are often shifty-eyed and we look askance at them. According to Dictionary.com, to 'look askance' at someone is to "disapprove", while 'askance' simply means "sideways or obliquely". We show our disapproval by withdrawing access to eye-contact.
Eye-contact in humans is fun, but it has also proved to be an interesting tool for analyzing the interactions of non-human primates. Dr. Christine Johnson (UCSD) has been studying a triad (group of three) of bonobos (apes, above) at the San Diego Zoo. They don't talk and there is no way to 'look inside' their heads to see what they're thinking, but they are obviously social and cognitive. So Dr. Johnson decided to code her data for "brightness", the degree to which each bonobo is facing the others (i.e. access to eye-contact). Her data reveal that access to eye-contact ("brightness") in the group is reliably correlated with patterns in their social interactions.
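To get a feel for what this kind of coding-and-correlating looks like, here is a toy sketch (my own illustration with invented numbers, not Dr. Johnson's actual data or analysis): score each observation for "brightness", count the social interactions seen in the same window, and compute the correlation between the two series.

```python
# Toy sketch: does coded "brightness" track social interaction counts?
# Both series below are invented for illustration only.
import math

brightness = [0.2, 0.5, 0.9, 0.4, 0.8, 0.1, 0.7, 0.6]  # degree of mutual facing, 0-1
interactions = [1, 2, 5, 2, 4, 0, 4, 3]                 # interactions observed per window

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(brightness, interactions)
print(f"brightness vs. interactions: r = {r:.2f}")
```

A strong positive r in real data would be the quantitative version of "access to eye-contact is reliably correlated with social interaction."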
While I'm not really doing justice to the topic of eye-contact, or even to Dr. Johnson's research, for that matter, I hope you see what I mean.
Tuesday, October 7, 2008
Define "You"
I've been working in a lab that is interested in deixis, among other things. Deixis is, essentially, context-dependent linguistic reference. Words that would be entirely ambiguous out of context, like 'now', 'then', 'this', 'that', 'it', 'there', 'me' and 'you', are all deictics.
These words (or rather, how people use these words) are interesting because
a) the speaker must take a personal perspective to use them (if you are 'there', I must be 'here'),
b) they are used to talk about people, places, and things (both present and absent, concrete and abstract) in terms of how the speaker conceives of them, and
c) they are unintelligible to non-humans (though several animal species, notably chimps, can recognize themselves in a mirror, abstract symbolic reference is beyond their reach), as illustrated below.
PS - Thanks for the cartoon, Dr. Creel!
Monday, September 29, 2008
What YouTube Means About YOU
This quarter I am taking my second ethnography class. I'm interested in using ethnographic method for my own research (both now and in the future). According to Wikipedia, ethnography is a
"genre of writing that uses fieldwork to provide a descriptive study of human societies. Ethnography presents the results of a holistic research method founded on the idea that a system's properties cannot necessarily be accurately understood independently of each other."In practice, ethnography is learning how to look at, talk about and transcribe data (in this case, pictures, audio, and video) in ways that reveal patterns, regularities, causalities, etc. that tell us something about the system.
While ethnography has traditionally been a tool for anthropologists, both anthropology and ethnography are now being used to look at cognition and cognitive systems (yay!).
Here's a really brilliant presentation, given by Michael Wesch to the Library of Congress, showing what ethnography can reveal about the cultural and social properties of YouTube.
Friday, April 25, 2008
"Two by Two" or "Why One is Not Enough"
A lot of Cog Sci research is focused on figuring out the basic mechanisms of sensation and perception, mostly because we need to understand the basics before we can make real claims about more abstract, interesting, human-type cognitive activities. This is all just to preface the next bit.
As far as I can tell, our bodies know that 'two are better than one', especially when it comes to perceiving and navigating through space. Think about it. At an obvious and general level, we have two feet to walk and two hands to manipulate things. On a more cognitive neuroscience-y level, we have two eyes that see slightly different views of the world (allowing us to perceive depth more easily), and we have two ears that hear two different sound profiles of the world (allowing us to locate the sources of sounds). This all makes sense.
Here's where things get strange. We have a nose. We have two nostrils. Recently, researchers at Berkeley found that this is a big deal (in terms of spatial perception). Despite the fact that humans don't depend on olfaction (smell) the way other mammals do (see dogs, cats, rats, etc.), these researchers found that humans are perfectly capable of following a scent trail (see image #1: on the left is a dog following a pheasant's scent trail; on the right is one of the subjects in the study).
Not only that, but they used a 'nose prism' (row f in image #2, don't ask) that allowed the researchers to control whether the participants were breathing air from one or two airstreams. It turns out that participants breathing two different airstreams (meaning each nostril got slightly different air/scent inputs) were both faster and more accurate in their scent-tracking than those who smelled air from a single stream! I think it's crazy that we (our brains) are capable of detecting differences between our nostrils (which are only about 5mm apart, nothing!).
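The logic of "two sensors beat one" can be sketched as a toy model (my own illustration, not the study's method): with two spatially separated inputs, the *difference* between them tells you which way to turn, something a single input cannot.

```python
# Toy model of bilateral scent comparison: steer toward the stronger reading.
# With only one sensor you'd know overall intensity but not direction.

def steer(left: float, right: float, threshold: float = 0.05) -> str:
    """Turn toward the nostril sensing the stronger scent."""
    diff = left - right
    if diff > threshold:
        return "turn left"
    if diff < -threshold:
        return "turn right"
    return "go straight"

# Scent trail slightly to the tracker's left: the left nostril reads stronger.
print(steer(0.82, 0.61))  # -> turn left
```

This is the same trick the two ears use for sound localization, just with concentration differences instead of timing and loudness differences.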
Wednesday, April 23, 2008
In a Nutshell
Boys and Breakfast
Recently I've been going to the BBC for my news. Today one of the headlines for their "Health" section was "High-calorie diet linked to boys".
According to the article, women who have higher-calorie, nutrient-rich diets around the time of conception and early pregnancy tend (56% of the 740 first-time pregnancies studied) to have male children. This trend has been well-documented in other species (horses, cows, etc.). There has also been a slow, steady decline in the number of boy babies born in developed countries over the past few decades.
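Out of curiosity, here's a quick back-of-the-envelope check (my own, not from the article) of whether 56% boys out of 740 pregnancies could plausibly be a 50/50 coin-flip fluke, using an exact binomial tail with only the standard library. The boy count is inferred from the reported 56%, so treat it as approximate.

```python
# Is 56% boys out of 740 births consistent with chance (p = 0.5)?
import math

n = 740
k = round(0.56 * n)  # ~414 boys, inferred from the reported 56%

# Exact upper tail: P(X >= k) when X ~ Binomial(n, 0.5)
p_upper = sum(math.comb(n, i) for i in range(k, n + 1)) / 2 ** n
print(f"{k} boys of {n}: one-sided p = {p_upper:.4f}")
```

The tail probability comes out well under 0.01, so the skew reported in the study is very unlikely to be a sampling accident on its own.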
The idea is that well-nourished mothers are more likely to live in favorable environments, environments that could support a whole bunch of babies. It makes sense, under these conditions, to have boy babies because boys could sire more babies than girls could mother...
The point of all this (for me) is that it is tremendously ironic that the cultural values and habits of the affluent (thin, well-managed women, preferably too busy to eat breakfast) would create conditions of scarcity within the individual (lean times, more girl-children).
Sorry to stray from cog sci, but intriguing things are intriguing things, and both nutrition and gender have interesting and significant effects on cognition, so this might relate to something more relevant later.
Also, for the full article, go to http://news.bbc.co.uk/2/hi/health/7358384.stm .
Wednesday, April 2, 2008
Visualizing the Inside Out
A design group from Hong Kong has come up with a clean and clever way to help the general public visualize the impact of the (nearly) invisible. These guys printed a poster (with a drawing of our respiratory system) with clear sticky ink. Over time, the image (of our lungs) became visible as airborne pollutants collected on the poster. This is definitely a great example of meaningful yet parsimonious design.
Thursday, March 20, 2008
BrainRise
Monday, January 28, 2008
Oddballs and the Unexpected
There are many ways to see what the brain is doing during a task. One of the most common techniques is called electroencephalography (EEG), which measures the electrical activity of the brain using electrodes on the scalp. Brain researchers have come to recognize certain characteristic brain responses (event-related potentials, or ERPs) to specific classes of stimuli. The one I want to talk about is called the N400.
The N400 appears when a person is presented with a semantically unfitting or 'oddball' (yes, that is a scientific term) sentence. For example, "I spread my toast with jam and socks."
Recently, researchers have been arguing about whether or not people integrate context information when processing the meaning of sentences. By arguing, I mean passionately writing papers and designing definitive experiments.
So this Dutch scientist, van Berkum, thought: "Well, if context information IS integrated in language processing, the 'oddball' effect (the N400) should happen when a perfectly acceptable sentence is presented in an inappropriate context." So he presented the same sentence ("I would like a glass of wine.") twice: once in an adult voice, and once in a child's voice. Sure enough, listeners showed an N400 response to an entirely acceptable sentence in an inappropriate context!
I love elegantly designed experiments like this one because even though we intuitively understand something (like that context is a factor when we are interpreting meaning), conclusive scientific evidence for it can be difficult to come by.
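For the curious, here's a minimal sketch (invented numbers, not real EEG data) of the averaging that produces an ERP like the N400: single trials are dominated by noise, but averaging many stimulus-locked epochs cancels the noise and leaves the shared deflection.

```python
# Simulated ERP averaging: noise cancels, the stimulus-locked dip remains.
import random

random.seed(0)
N_TRIALS, N_SAMPLES = 100, 150   # 150 samples ~ 600 ms at 250 Hz
N400_WINDOW = range(87, 113)     # roughly 350-450 ms post-stimulus

def make_epoch():
    """One simulated trial: Gaussian noise plus a negative dip in the N400 window."""
    return [random.gauss(0, 5) + (-4 if t in N400_WINDOW else 0)
            for t in range(N_SAMPLES)]

epochs = [make_epoch() for _ in range(N_TRIALS)]

# The ERP is just the pointwise mean across trials.
erp = [sum(trial[t] for trial in epochs) / N_TRIALS for t in range(N_SAMPLES)]

window_mean = sum(erp[t] for t in N400_WINDOW) / len(N400_WINDOW)
print(f"mean amplitude in N400 window: {window_mean:.2f} microvolts")
```

In a single simulated trial the dip is buried in noise five times its size; after averaging 100 trials it stands out clearly, which is exactly why ERP experiments need many repetitions per condition.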
Saturday, January 19, 2008
Music Make Me Lose Control
My father was scanning the headlines and he came across a news article about a woman who had MUSIC-INDUCED epilepsy! They call it musicogenic epilepsy, and according to the hospital that treated her, she is one of 5 such cases in the world today.
Epileptics of this sort experience seizures only while hearing music. For this patient, Stacy Gayle, singing in her church choir and listening to music by Sean Paul sent her into grand mal seizures.
Ms. Gayle found that medication didn't really help with her seizures, so she went to Long Island Jewish Medical Center for treatment. Doctors there determined that her seizures came from a single, abnormal region of her right hemisphere. They recorded the electrical activity of her brain (using EEG) and when they saw that she was going into a seizure, they injected her with a radioactive tracer and performed a PET scan, which revealed that her seizures started in a part of her temporal lobe (the medial temporal lobe). To further pinpoint the abnormal region, they implanted a set of 100 electrodes in the right side of her brain, targeting the medial temporal lobe. Once these electrodes had recorded her seizure, doctors were able to remove the exact epicenter of her epilepsy (without giving her any neurological deficits!). She has not had a seizure in the 3.5 months since the operation.
For the full article, visit
http://www.northshorelij.com/body.cfm?id=15&action=detail&ref=996
Friday, January 11, 2008
In the Mood for Memory
It's January and I just started a new quarter at UCSD, so I'm going to make a fresh attempt at the habit of posting when I learn something especially interesting. Maybe, if this works, I'll even do a "Coolest Cog Sci Fact of the Week" sort of thing or something.
So, to begin with, I'm taking a class called "Learning, Memory and Attention." In her first lecture Dr. Sarah Creel (my professor) told us the most interesting thing about memory. She was giving us a few scientifically informed study tips (study a little bit at a time, be sure you understand the concepts well enough to explain them to your grandmother, remember that caffeine and exercise help consolidate memory, etc.) when she mentioned that it is important to study in the same state of mind that you will be in when you take the test. In other words, memory is functionally dependent on your brain state.
Here's her illustration (note: stories and concrete examples are amazing ways to make a concept memorable). A friend of hers, let's call her Carly, was an undergrad at Berkeley taking calculus. Carly was also on crack. When she studied she was on crack, when she went to class she was on crack, when she took tests she was on crack. Carly got an A in calculus. At this point Dr. Creel made sure to disclaim that "This isn't a drug endorsement, and crack doesn't make you smarter." Over the summer Carly got clean and when she came back in the fall she took the next calculus class in the series. Carly's new teacher gave her a test to see how much of the material from the previous class she had retained. She got an F. The sober Carly couldn't remember the things that the drugged Carly had learned. So, Dr. Creel said, the moral of the story is that you should make it easy on yourself by studying AND testing sober.
Great story, but I am much more interested in the implications this has for the nature of my own identity than in improving my memory. Think about it. This memory principle applies to our brain states in general (i.e. our overall mood and neurochemical activity) rather than just drug-induced brain states. In practice this means that, for example, when I am depressed I most easily remember episodes and information that I encountered during past periods of depression (by depression I mean a mood, not clinical depression). We've all experienced how our moods seem to feed themselves, but just think about it in terms of identity. Identity is essentially composed from a series of key memories about the experiences that we have had and what we have made them mean about the world. If I am building a definition of myself (to a certain degree) from my own memories, then my understanding of who I am when I am depressed is significantly different from who I think I am when I am happy or calm, simply because I am constructing my identity from a different set of memories.
Now, I'm definitely no expert, and I haven't done any experiments to investigate this further (yet), but I have a few theories (or rather, informed intuitions) about the nature of self and identity. There are a ton of real-life illustrations of how a 'person' can behave as though he/she is actually a series of distinctly different people (generally speaking, in terms of personality traits and behavior). Obvious examples are people with Multiple Personality Disorder and people who are bipolar, but the same principle can be seen in perfectly functional, well-adjusted, 'normal' people. For example, a woman uses significantly different behaviors and cognitive strategies (she assumes an entirely different role) when she is interacting with her child than she does when she is interacting with her husband. Our culture and our immediate context (our social role and mood) help us filter out memories that aren't appropriate or are inconsistent (in terms of our social context). I definitely need to do a whole lot more thinking (and maybe some research) about this, but if any of you have any thoughts on the subject, please do share them.
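To make the principle concrete, here is a toy sketch of state-dependent retrieval (my own illustration, not a model from the lecture or from any actual memory research): memories tagged with the mood they were encoded in score higher for recall when the current mood matches, so different moods surface different subsets of your past.

```python
# Toy state-dependent retrieval: mood-congruent memories are ranked first.
# All memories and scores below are invented for illustration.

memories = [
    {"content": "aced the calculus final", "mood": "happy"},
    {"content": "rainy week alone",        "mood": "sad"},
    {"content": "beach trip with friends", "mood": "happy"},
    {"content": "failed retention test",   "mood": "sad"},
]

def recall(current_mood, match_bonus=1.0, base=0.2):
    """Rank memories, boosting those whose encoding mood matches the current one."""
    scored = [(base + (match_bonus if m["mood"] == current_mood else 0.0), m["content"])
              for m in memories]
    return [content for score, content in sorted(scored, reverse=True)]

print(recall("sad")[:2])    # mood-congruent memories surface first
print(recall("happy")[:2])
```

The point of the sketch is just that the same memory store yields different "top of mind" sets depending on the retrieval state, which is the seed of the identity argument above.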