I'd wear a computer on my wrist. Would you?

We may soon be wearing our electronic gadgets—from heart monitors to cell phones—as a second skin. Researchers have discovered a way to create circuits so thin and flexible that they can be applied like temporary tattoos.

“All established forms of electronics are hard, rigid,” study author Yonggang Huang, an engineer at Northwestern University, tells ScienceDaily.com. But by using wires thinner than a hair and mounting them in flexible sheets of silicon and rubber, he and his colleagues were able to make digital patches that are as soft and elastic as human skin. The circuits, called epidermal electronic systems, can be rubbed on with water instead of needing tape or glue to attach. And they’re small enough to be recharged with solar power.

Researchers say the technology will be nearly invisible to wearers and could be used instead of bulky machines to record medical patients’ vital signs. The paste-on computers will also let us interact with video games and MP3 players using muscle or voice commands. “Ultimately,” says co-author John Rogers, they will “blur the distinction between electronics and biology.”

Would you wear one on your wrist?

- As seen in The Week
Brought to you by NetLingo: Improve Your Internet IQ

Evidence Suggests that the Internet Changes How We Remember

Kenrick Vezina of MIT’s Technology Review writes about a recent study suggesting that we augment our memory with the Internet. The flood of information available online with just a few clicks and finger-taps may be subtly changing the way we retain information, according to the study. But this doesn't mean we're becoming less mentally agile or thoughtful, say the researchers involved. Instead, the change can be seen as a natural extension of the way we already rely upon social memory aids—like a friend who knows a particular subject inside out.

Researchers and writers have long debated how our growing reliance on Internet-connected computers may be changing our mental faculties. The constant assault of tweets and YouTube videos, the argument goes, might be making us more distracted and less thoughtful—in short, dumber. However, there is little empirical evidence of the Internet's effects, particularly on memory.

Betsy Sparrow, assistant professor of psychology at Columbia University and lead author of the new study, put college students through a series of four experiments to explore this question.

One experiment involved participants reading and then typing out a series of statements, like "Rubber bands last longer when refrigerated," on a computer. Half of the participants were told that their statements would be saved, and the other half were told they would be erased. Additionally, half of the people in each group were explicitly told to remember the statements they typed, while the other half were not. Participants who believed the statements would be erased were better at recalling them, regardless of whether they were told to remember them.

Another experiment had subjects again typing predetermined statements into a computer, but this time, some were told that their statements would be saved in a specific folder on that machine. Participants were better at remembering the names of the folders a statement was stored in than they were at remembering the statements themselves.

The experiments suggest that we are less likely to remember facts when we know they can be easily looked up online, the researchers say. This conclusion is an extension of an idea proposed some 30 years ago by Sparrow's mentor (and a coauthor of a paper describing the latest work), Daniel Wegner, of Harvard's psychology department.

Wegner proposed the idea of "transactive memory" as a collective social memory of sorts. For example, if a friend has an exhaustive knowledge of Greek history, you can simply remember that The Iliad is Greek and that your friend knows about Greek things, rather than remembering who wrote the epic poem. Sparrow and Wegner say that the Internet may serve a similar function, acting as an extension of this external memory.

Mary C. Potter, professor of psychology in MIT's Department of Brain and Cognitive Sciences, says the study supports the commonsense idea that we use external tools to remember information. She notes, however, that many of the results are at the threshold for statistical significance, and says the study should be seen as suggestive rather than conclusive.

Potter also wonders if the results may be due to sociological rather than psychological phenomena. When your friend whips out his smart phone to look up information about a band, this could be "because it's fun" rather than being about changes to how our brains store information, she says.

Nicholas Carr has been one of the leading voices in the debate. His book The Shallows, published in June, contends that the Internet is having a detrimental effect, an argument he supports with numerous scientific studies. He says Sparrow's study "indicates how flexible our brains are in adapting to our tools."

However, he's not convinced that this adaptation is positive. "It's critically important to remember that there's a difference between external memory and internal memory," he says. "If you're not internalizing ... then your understanding becomes less personal, less distinctive, and, I think, ultimately more superficial."

Sparrow, on the other hand, sees this adaptation as positive. She says our minds are molding to the Internet, just as they have in the past with technologies like the written word.

She's now trying to probe the benefits of this external memory with more experiments. Imagine a history student reading a dense passage, full of dates and names, about the American Revolution. Perhaps if the student is confident that the details will be available on the Internet, he will be better able to get a larger sense of why the revolution happened. Her intuition is that when we expect the details to be available later, we're better at looking for larger messages that might be obscured if we were preoccupied with minutiae.

- As seen in Technology Review

Pentagon Wants a Social Media Propaganda Machine

You don’t need to have 5,000 friends on Facebook to know that social media can be a notorious mix of rumor, gossip and just plain disinformation. The Pentagon is looking to build a tool to sniff out social media propaganda campaigns and spit some counter-spin right back at them, according to Adam Rawnsley in Wired.

Defense Department extreme technology arm DARPA unveiled its Social Media in Strategic Communication (SMISC) program. It’s an attempt to get better at both detecting and conducting propaganda campaigns on social media. SMISC has two goals. First, the program needs to help the military better understand what’s going on in social media in real time — particularly in areas where troops are deployed. Second, Darpa wants SMISC to help the military play the social media propaganda game itself.

Darpa’s not looking to track every meme, of course. It has no interest in the latest twists on foul bachelor frog, or in whether the Taliban is making propaganda versions of courage wolf. Instead, it wants to see what ideas are bubbling up among social media users in a particular area — say, where American troops are deployed.

More specifically, SMISC needs to be able to seek out “persuasion campaign structures and influence operations” developing across the social sphere. SMISC is supposed to quickly flag rumors and emerging themes on social media and figure out who’s behind them. Moreover, Darpa wants SMISC to be able to actually determine whether a given theme is a random product of the hivemind or a propaganda operation by an adversary nation or group.

Of course, SMISC won’t be content just to hang back and monitor social media trends in strategic locations. It’s about building a better spin machine for Uncle Sam, too. Once SMISC latches on to an influence operation being launched, it’s supposed to help out in “countermessaging.”

Darpa’s announcement talks about using SMISC in “the environment in which the military operates” and where it “conducts operations.” That strongly implies it’s intended for use in sensing and messaging to foreign social media. It had better be, lest it run afoul of the law: the Smith-Mundt Act makes pointing propaganda campaigns at domestic audiences illegal.

What exactly SMISC will look like in its final form is hard to say. At the moment, Darpa is only in the very beginning stages of researching its social media tool. It’s focused on researching the brains of the program — the algorithms and software that’ll identify, locate and make sense of social media trends.

For that, they need some social media data to play around with and test on. Darpa wants bidders to create it in one of two ways. Bidders can round up a few thousand test subjects willing to let their social media data be a guinea pig for SMISC’s software. Alternatively, they can rope in some consenting test subjects for a massively multiplayer role playing game in which generating social media data is a key part of gameplay.

SMISC is yet another example of how the military is becoming very interested in what’s going on in the social media sphere. Darpa has plans to integrate social media data into its manhunt master controller, Insight. NATO has already been paying keen attention to Twitter, using data from the micro-blogging service as an intel source to aid in bomb targeting decisions.

Darpa’s presolicitation offers a very vaguely sourced anecdote spelling out how SMISC could be used. It details how a social media rumor about the location of a particularly reviled individual — identity and location undisclosed — almost led a lynch mob to storm a house in search of him. Authorities who happened to be paying attention to the Internet rumor were fortunate enough to spot it in time to intervene. In this telling of SMISC’s potential applications, the software could be used as a tripwire to stop potentially dangerous social media campaigns in their tracks. But we’re sure you — and the Pentagon — can think of a lot less anodyne uses for Darpa’s social media propaganda tool.

- As seen in Wired

Army Seeks Social Media Gurus to Save Afghan War

Know how to tweet? Or how to put words into the mouths of foreign security functionaries? If so, the U.S. Army wants you to help un-quagmire the Afghanistan war. In honor of the 10th anniversary of 9/11, here’s one way you can help (and escape the bleak job market).

A new solicitation from the Army seeks communications experts to run the full spectrum of outreach and messaging for the war effort, said Spencer Ackerman in Wired. A new “Web Content/Social Media Manager” will work with the U.S. military command in Afghanistan, known by the acronym USFOR-A, to spruce up and maintain “the command’s official website and related social media platforms, such as Facebook, Twitter, YouTube and Flickr.” Other officials will dig into the Afghan security ministries to advise key officials how to convince people they’re competent, energetic and not at all corrupt.

To non-Afghan eyes, USFOR-A’s got a pretty robust social media presence. Check out how often it tweets its messaging on Twitter. Its YouTube channel is filled with positive videos, and its Facebook page — folded into the NATO command’s page — has nearly 80,000 Likes. Is the war won yet?

Evidently not. The solicitation sees the Taliban doing a better communications job than the U.S.: “To date, the Insurgents (INS) have undermined the credibility of USFOR-A, the International Community (IC), and Government of the Islamic Republic of Afghanistan (GIRoA) through effective use of the information environment, albeit without a commensurate increase in their own credibility.” Guess the Army thinks the Taliban’s recent English-language tweeting and SMS terror campaign is having an impact. Or that Gen. Stanley McChrystal’s 2009 plea to revamp the war’s communications apparatus didn’t have the desired effect.

That problem’s magnified when it comes to the Afghan government, which is so corrupt that Ryan Crocker, the Obama administration’s nominee for ambassador to Kabul, compared its perfidies to a “second insurgency” on Wednesday. The answer? “[C]ulturally-astute and culturally-attuned communication and public affairs advisement” to mouthpieces for the ministries of Defense and Interior.

What will those advisers do? The short answer is teach them how to spin. The long answer: “better align media reporting and public perception and proactively engage opinion-shapers, from media to key leaders, in order to bring these attributes of the information landscape into alignment.”

This is only partially about gaining or keeping Afghan support. The bolstered social networking push needs to have rapid translation into Dari and Pashto, as well as ceaselessly nimble translations of the local press so the military gets feedback, the solicitation says. But it’s primarily to “inform key audiences” — that is, “media and civilian populations internationally and within the region” about USFOR-A spin. And when the best that the smooth diplomat Crocker can tell the Senate about the war is that it’s “not… hopeless,” it’s no wonder that the Army thinks USFOR-A needs all the communications help it can get.

- As seen in Wired

Google: Changing how people think?

A new Columbia University study found that the use of Internet search engines alters the way the brain stores information.

Is Google making human memory obsolete? asked Matt Peckham in Time.com. That’s the question raised by a new Columbia University study, which found that the use of Internet search engines like Google and Yahoo changes the way the brain stores information. In a series of experiments, researchers found that student subjects quickly forgot information they’d entered into a computer, if they believed they could just retrieve it from the computer later. In another test, subjects were asked to remember a string of facts and which folders these facts were stored in. To the researchers’ surprise, the subjects recalled the correct folders—but not the information itself. What this study reveals, said Kari Lipschutz in Adweek, is that we’re adapting to a powerful new technology by altering how we think. In effect, “Google is becoming your brain’s external hard drive.”

As one who likes my brain the way it is, said Jakob Nielsen in Businessweek.com, I find this pretty alarming. It’s certainly convenient to, say, pull up any historical fact in a microsecond. But for a sense of the relative strength of European navies during the Renaissance, and how the struggle for power in that era has shaped the modern world, I still read a book or two—and weave that information into my memory. That’s what you call learning, and it’s what leads to “deep understanding.” The Web has its uses, but mainly, it “fragments information into tiny nuggets that can be digested in a two-minute visit.”

Socrates made a similar complaint in 370 B.C., said Ronald Bailey in Reason. That was long before Google, of course, but back then, the Greek philosopher was worried that writing was making human beings dumber. The written word transmits merely “the appearance of wisdom,” Socrates said, arguing that it would diminish the importance of memory and extemporaneous speech. He was wrong. So are the people who think Google will make us illiterate and shallow, said David Alan Grier in Businessweek.com. Over the last decade, Google and the Internet have raised research standards, stimulated political argument and discussion of thousands of topics, and given any individual with a computer instant access to an ever-expanding body of human knowledge. Does Google “provide all the information that we will ever need?” Of course not. Does it, on the whole, make us smarter? Sure it does.

- As seen in The Week