Tuesday, June 23, 2015


(Image: Siri Stafford/Getty)
Thanks to the latest advances in computer vision, we now have machines that can pick you out of a line-up. But what if your face is hidden from view?
An experimental algorithm out of Facebook's artificial intelligence lab can recognise people in photographs even when it can't see their faces. Instead it looks for other unique characteristics like your hairdo, clothing, body shape and pose.
Modern face-recognition algorithms are so good they've already found their way into social networks, shops and even churches. Yann LeCun, head of artificial intelligence at Facebook, wanted to see whether they could be adapted to recognise people in situations where someone's face isn't clear, something humans can already do quite well.
"There are a lot of cues we use. People have characteristic aspects, even if you look at them from the back," LeCun says. "For example, you can recognise Mark Zuckerberg very easily, because he always wears a gray T-shirt."
The research team pulled almost 40,000 public photos from Flickr - some showing people with their faces clearly visible, others with the subjects turned away - and ran them through a sophisticated neural network.
The final algorithm was able to recognise individual people's identities with 83 per cent accuracy. It was presented earlier this month at the Computer Vision and Pattern Recognition conference in Boston, Massachusetts.
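To make the idea concrete, here is a minimal sketch of that kind of pipeline in Python with PyTorch. It is not Facebook's published model - the architecture, sizes and matching step are all illustrative - but it shows the core trick: map a whole-body crop to an embedding, then identify a person by comparing embeddings, so the face itself is never required.

```python
# Illustrative only: a tiny embedding net for non-face cues (hair,
# clothing, body shape). Real systems are far deeper and trained on
# large labelled photo sets. Assumes PyTorch is installed.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BodyCueEmbedder(nn.Module):
    """Maps a body crop to a unit-length embedding vector."""
    def __init__(self, embed_dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, embed_dim)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return F.normalize(self.fc(x), dim=1)

model = BodyCueEmbedder()
gallery = model(torch.randn(5, 3, 128, 64))  # 5 known people (body crops)
query = model(torch.randn(1, 3, 128, 64))    # an unlabelled photo
scores = query @ gallery.T                   # cosine similarity
print("best match: person", scores.argmax().item())
```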
An algorithm like this could one day help power photo apps like Facebook's Moments, released last week.
Moments scours through a phone's photos, sorting them into separate events like a friend's wedding or a trip to the beach and tagging whoever it recognises as a Facebook friend. LeCun also imagines such a tool would be useful for the privacy-conscious - alerting someone whenever a photo of themselves, however obscured, pops up on the internet.
The flipside is also true: the ability to identify someone even when they are not looking at the camera raises some serious privacy implications. Last week, talks over rules governing facial recognition collapsed after privacy advocates and industry groups could not agree.
"If, even when you hide your face, you can be successfully linked to your identify, that will certainly concern people," says Ralph Gross at Carnegie Mellon University in Pittsburgh, Pennsylvania, who says the algorithm is impressive. "Now is a time when it's important to discuss these questions."

Facebook can recognise you in photos even if you're not looking

All together now: yeasts can evolve to form snowflake-like multicellular shapes (Image: Courtesy of Jennifer Pentz, Georgia Tech)

The leap from single-celled life to multicellular creatures is easier than we ever thought. And it seems there's more than one way it can happen.

The mutation of a single gene is enough to transform single-celled brewer's yeast into a "snowflake" that evolves as a multicellular organism.

Similarly, single-celled algae quickly evolve into spherical multicellular organisms when faced with predators that eat single cells.

These findings back the emerging idea that this leap in complexity isn't the giant evolutionary hurdle it was thought to be.

At some point after life first emerged, some cells came together to form the first multicellular organism. This happened perhaps as early as 2.1 billion years ago. Others followed – multicellularity is thought to have evolved independently at least 20 times – eventually giving rise to complex life, such as humans.

But no organism is known to have made that transition in the past 200 million years, so how and why it happened is hard to study.

Special snowflake

Back in 2011, evolutionary biologists William Ratcliff and Michael Travisano at the University of Minnesota in St Paul coaxed unicellular yeast to take on a multicellular "snowflake" form by taking the fastest-settling yeast out of a culture, using it to found new cultures, and then repeating the process. Because clumps of yeast settle faster than individual cells, this effectively selected yeast that stuck together instead of separating after cell division.
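The protocol itself is simple enough to simulate. The toy Python script below is our own illustration, not the team's code: clump size stands in for settling speed, a rare "separation" mutation creates clumps, and repeatedly refounding the culture from the fastest settlers lets stickiness take over.

```python
# Toy model of settling selection (illustrative, not Ratcliff's code).
# Each individual is a clump size: 1 = unicellular, >1 = "snowflake".
import random

POP, ROUNDS, MUTATION_RATE = 1000, 20, 0.01

population = [1] * POP
for _ in range(ROUNDS):
    offspring = []
    for clump in population:
        # Occasionally a separation-gene mutation arises in a single cell.
        if clump == 1 and random.random() < MUTATION_RATE:
            clump = 2
        # Sticky mutants grow their clump; single cells stay single.
        offspring.append(clump + 1 if clump > 1 else 1)
    # Selection: bigger clumps settle faster, so keep the fastest half
    # and refound the culture from them.
    offspring.sort(reverse=True)
    population = offspring[:POP // 2] * 2

multicellular = sum(1 for c in population if c > 1)
print(f"{multicellular}/{POP} lineages are now multicellular")
```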

The team's latest work shows that this transformation from a single-celled to a multicellular existence can be driven by a single gene called ACE2, which controls the separation of daughter cells after division, Ratcliff told the Astrobiology Science Conference, held 15-19 June in Chicago.

And because the snowflake grows in a branching, tree-like pattern, any later mutations are confined to single branches. When the original snowflake gets too large and breaks up, these mutant branches fend for themselves, allowing the value of their new mutation to be tested in the evolutionary arena.

"A single mutation creates groups that as a side effect are capable of Darwinian evolution at the multicellular level," says Ratcliff, who is now at Georgia Tech University in Atlanta.

Bigger is better

Ratcliff's team has previously also evolved multicellularity in single-celled algae called Chlamydomonas, through similar selection for rapid settling. The algal cells clumped together in amorphous blobs.

Now the feat has been repeated, but with predators thrown into the mix. A team led by Matt Herron of the University of Montana in Missoula exposed Chlamydomonas to a paramecium, a single-celled protozoan that can devour single-celled algae but not multicellular ones.

Safety in even numbers (Image: Jacob Boswell)

Sure enough, two of Herron's five experimental lines became multicellular within six months, or about 600 generations, he told the conference.

This time, instead of daughter cells sticking together in an amorphous blob as they did under selection for settling, the algae formed predation-resistant, spherical units of four, eight or 16 cells that look almost identical to related species of algae that are naturally multicellular.

"It's likely that what we've seen in the predation experiments recapitulates some of the early steps of evolution," says Herron.

Neither Ratcliff's yeast nor Herron's algae has unequivocally crossed the critical threshold to multicellularity, which would require cells to divide labour between them, says Richard Michod of the University of Arizona in Tucson.

But the experiments are an important step along that road. "They're opening up new avenues for approaching this question," he says.

One gene may drive leap from single cell to multicellular life

Will more sensory substitution devices hit the market soon?

The BrainPort V100 (Image: Courtesy Wicab, Inc.)

Last week, the Food and Drug Administration (FDA) announced that medical device company Wicab is allowed to market a new device that will help the blind "see." The device, called the BrainPort V100, can help the blind navigate by processing visual information and communicating it to the user through electrodes on his tongue. Though this isn't the first device to go on the market using sensory substitution (where information perceived by one sense is communicated through another), the sophistication and usability of the BrainPort V100 could mean that the number of sensory substitution devices permitted by the FDA is on the rise.

The BrainPort V100 consists of a pair of dark glasses and tongue-stimulating electrodes connected to a handheld battery-operated device. When cameras in the glasses pick up visual stimuli, software converts the information to electrical pulses sent as vibrations to be felt on the user's tongue. As with most sensory substitution devices, "seeing" with your tongue may not be intuitive at first. But the researchers who developed the device tested it over the course of a year, training users to interpret the vibrations. Studies showed that 69 percent of the test subjects were able to identify an object using the BrainPort device after a year of training. However, the device is expensive; Wicab told Popular Science that it will cost $10,000 per unit, the same as its price when first reported back in 2009.
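To get a feel for the kind of processing involved, here is a rough Python sketch - our own illustration, since Wicab's actual software is proprietary - that reduces a grayscale camera frame to a coarse grid of stimulation intensities, one value per tongue electrode. The 20-by-20 grid size is an assumption for the example.

```python
# Illustrative only: downsample a camera frame into per-electrode
# pulse strengths. Assumes NumPy; the grid size is our assumption.
import numpy as np

GRID = 20  # electrodes per side of the tongue array (assumed)

def frame_to_pulses(frame: np.ndarray) -> np.ndarray:
    """Reduce a grayscale frame to a GRID x GRID array of pulse
    strengths in [0, 1]: brighter region -> stronger tingle."""
    h, w = frame.shape
    bh, bw = h // GRID, w // GRID
    # Average each block of pixels into one electrode value.
    blocks = frame[:bh * GRID, :bw * GRID].reshape(GRID, bh, GRID, bw)
    return blocks.mean(axis=(1, 3)) / 255.0

frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # fake frame
pulses = frame_to_pulses(frame)
print(pulses.shape)  # (20, 20) stimulation intensities
```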

Researchers have been fiddling with sensory substitution for a long time, but most of these devices are not yet widely available. The BrainPort V100 will be one of the first, having passed the FDA's review through recently updated guidelines called the premarket review pathway: "a regulatory pathway for some low- to moderate-risk medical devices that are not substantially equivalent to an already legally-marketed device," according to the press release. Since this device is now allowed to be marketed and was approved relatively quickly through these new guidelines, the BrainPort may be paving the way for an explosion of sensory substitution devices to hit the market in the next few years, which could help the growing numbers of Americans with sensory impairments.

Device That Helps Blind People See With Their Tongues Just Won FDA Approval

Technology can reconstruct video from a person's thoughts and even anticipate a driver's next move. Now, a brain-to-text system can translate brain activity into written words.

In a recent study in Frontiers in Neuroscience, seven patients had electrode sheets placed on their brains, which collected neural data while they read passages aloud from the Gettysburg Address, JFK's inaugural speech, and Humpty Dumpty.

As each patient spoke, a computer algorithm learned to associate speech sounds—such as "foh", "net", and "ik"—with different firing patterns in the brain cells. Eventually it learned to read the brain cells well enough that it could guess which sound they were producing with up to 75 percent accuracy. But the program doesn't need 100 percent accuracy to put those sounds together into the word "phonetic". Because our speech only takes certain forms, the system’s algorithm can correct for these errors “just like autocorrect,” says Peter Brunner, one of the co-authors of the study.

“Siri wouldn’t be more accurate than 50 or 70 percent,” he says. “Because it knows what the potential options are that you choose, or the typical sentences that you say, it can actually utilize this information to get the right choice.”
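That autocorrect-style trick is easy to demonstrate. In the Python sketch below - our own toy example, not the study's decoder - the classifier is only 70 to 80 percent sure of each sound, yet constraining the guesses to a small lexicon still recovers the right word. The phone inventory and pronunciations are made up for the illustration.

```python
# Illustrative only: noisy per-sound guesses + a lexicon constraint.
LEXICON = {
    "phonetic": ["foh", "net", "ik"],
    "frenetic": ["freh", "net", "ik"],
    "phobia":   ["foh", "bee", "ah"],
}

def score(word_phones, predicted):
    """Sum the probability the classifier assigned to each expected
    sound; higher means the neural data fit this word better."""
    if len(word_phones) != len(predicted):
        return float("-inf")
    return sum(probs.get(phone, 0.0)
               for phone, probs in zip(word_phones, predicted))

# Per-position probabilities from a hypothetical neural decoder,
# roughly 75 percent confident in the right sound, as in the study.
predicted = [
    {"foh": 0.75, "freh": 0.25},
    {"net": 0.80, "met": 0.20},
    {"ik": 0.70, "ip": 0.30},
]
best = max(LEXICON, key=lambda w: score(LEXICON[w], predicted))
print(best)  # -> "phonetic"
```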

It is important to record the data directly from the brain, says Brunner, because picking up neural activity from the scalp only gives a “blurred version” of what is happening in the brain. He likened the latter method to flying 1000 feet above a baseball stadium and only being able to vaguely recognize that people are cheering, but not the specifics of what the people’s faces look like.

In this case, the patients were already undergoing an epilepsy procedure where the skull is popped open and an electrode grid is placed on the brain to map areas where neurons are misfiring. The resourceful team of researchers from the National Center for Adaptive Neurotechnologies and the State University of New York at Albany used this time to conduct their own research. However, it meant the study was limited by each patient's individualized epilepsy treatment, such as where the electrodes were placed on the brain.

Because every person's brain is so unique, and the neural activity must be picked up directly from the brain, it would be difficult to create a general brain-to-text device for the average consumer, says Brunner. However, this technology has a lot of potential for people with neurological diseases such as ALS, which rob them of the ability to move and to speak. Instead of using an external device, as Stephen Hawking does, to pick out words on a screen for a computer to read, the computer would simply speak your mind.

“This is just the beginning,” said Brunner. “The prospects of this are really endless.”

Mind-Reading Program Translates Thoughts Into Text

 