
In the Future, Machines Will Borrow Our Brain’s Best Tricks

Steve sits up and takes in the crisp new daylight pouring through the bedroom window. He looks down at his companion, still pretending to sleep. “Okay, Kiri, I’m up.”

She stirs out of bed and begins dressing. “You received 164 messages overnight. I answered all but one.”

In the bathroom, Steve stares at his disheveled self. “Fine, give it to me.”

“Your mother wants to know why you won’t get a real girlfriend.”

He bursts out laughing. “Anything else?”

“Your cholesterol is creeping up again. And there have been 15,712 attempts to hack my mind in the last hour.”

“Good grief! Can you identify the source?”

“It’s distributed. Mostly inducements to purchase a new RF oven. I’m shifting ciphers and restricting network traffic.”

“Okay. Let me know if you start hearing voices.” Steve pauses. “Any good deals?”

“One with remote control is in our price range. It has mostly good reviews.”

“You can buy it.”

Kiri smiles. “I’ll stay in bed and cook dinner with a thought.”

Steve goes to the car and takes his seat.

Car, a creature of habit, pulls out and heads to work without any prodding.

Leaning his head back, Steve watches the world go by. Screw the news. He’ll read it later.

Car deposits Steve in front of his office building and then searches for a parking spot.

Steve walks to the lounge, grabs a roll and some coffee. His coworkers drift in and chat for hours. They try to find some inspiration for a new movie script. AI-generated art is flawless in execution, even in depth of story, but somehow it doesn’t resonate well with humans, much as one generation’s music does not always appeal to the next. AIs simply don’t share the human condition.

But maybe they could if they experienced the world through a body. That’s the whole point of the experiment with Kiri.…

It’s sci-fi now, but by midcentury we could be living in Steve and Kiri’s world. Computing, after about 70 years, is at a momentous juncture. The old approaches, based on CMOS technology and the von Neumann architecture, are reaching their fundamental limits. Meanwhile, massive efforts around the world to understand the workings of the human brain are yielding new insights into one of the greatest scientific mysteries: the biological basis of human cognition.

The dream of a thinking machine—one like Kiri that reacts, plans, and reasons like a human—is as old as the computer age. In 1950, Alan Turing proposed a test of whether machines can think: compare their conversation with that of humans. He predicted computers would pass his test by the year 2000. Computing pioneers such as John von Neumann also set out to imitate the brain. They had only the simplest notion of neurons, based on the work of neuroscientist Santiago Ramón y Cajal and others in the late 1800s. And the dream proved elusive, full of false starts and blind alleys. Even now, we have little idea how the tangible brain gives rise to the intangible experience of conscious thought.

Today, building a better model of the brain is the goal of major government efforts such as the BRAIN Initiative in the United States and the Human Brain Project in Europe, joined by private efforts such as those of the Allen Institute for Brain Science, in Seattle. Collectively, these initiatives involve hundreds of researchers and billions of dollars.

With systematic data collection and rigorous insights into the brain, a new generation of computer pioneers hopes to create truly thinking machines.

If they succeed, they will transform the human condition, just as the Industrial Revolution did 200 years ago. For nearly all of human history, we had to grow our own food and make things by hand. The Industrial Revolution unleashed vast stores of energy, allowing us to build, farm, travel, and communicate on a whole new scale. The AI revolution will take us one enormous leap further, freeing us from the need to control every detail of operating the machines that underlie modern civilization. And as a consequence of copying the brain, we will come to understand ourselves in a deeper, truer light. Perhaps the first benefits will be in mental health, organizational behavior, or even international relations.

Such machines will also improve our health in general. Imagine a device, whether a robot or your cellphone, that keeps your medical records. Combining this personalized data with a sophisticated model of all the pathways that regulate the human body, it could simulate scenarios and recommend healthy behaviors or medical actions tailored to you. A human doctor can correlate only a few variables at once, but such an app could consider thousands. It would be more effective and more personal than any physician.

Rigetti Launches Full-Stack Quantum Computing Service and Quantum IC Fab

Much of the ongoing quantum computing battle among tech giants such as Google and IBM has focused on developing the hardware necessary to solve problems that are intractable for classical computers. A Berkeley-based startup looks to beat those larger rivals with a one-two combo: a fab designed for speedy creation of better quantum circuits and a quantum computing cloud service that provides early hands-on experience with writing and testing software.

Rigetti Computing recently unveiled its Fab-1 facility, which will enable its engineers to rapidly build new generations of quantum computing hardware based on quantum bits, or qubits. The facility can spit out entirely new designs for 3D-integrated quantum circuits within about two weeks—much faster than the months usually required for academic research teams to design and build new quantum computing chips. It’s not so much a quantum computing chip factory as it is a rapid prototyping facility for experimental designs.

“We’re fairly confident it’s the only dedicated quantum computing fab in the world,” says Andrew Bestwick, director of engineering at Rigetti Computing. “By the standards of industry, it’s still quite small and the volume is low, but it’s designed for extremely high-quality manufacturing of these quantum circuits that emphasizes speed and flexibility.”

But Rigetti is not betting on faster hardware innovation alone. It has also announced its Forest 1.0 service, which enables developers to begin writing quantum software applications and simulating them on a 30-qubit quantum virtual machine. Forest 1.0 is based on Quil—a custom instruction language for hybrid quantum/classical computing—and open-source Python tools for building and running Quil programs.
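To give a concrete taste of what that looks like, here is a minimal sketch using pyQuil, Rigetti's open-source Python library for composing Quil programs. API details vary across library versions, so treat the exact names as illustrative:

```python
from pyquil.quil import Program
from pyquil.gates import H, CNOT

# Build a two-qubit Bell-state program: a Hadamard on qubit 0,
# then a controlled-NOT entangling qubit 0 with qubit 1.
p = Program(H(0), CNOT(0, 1))

# The Program object compiles down to plain Quil instructions:
print(p)
# H 0
# CNOT 0 1
```

Sent to Forest's 30-qubit quantum virtual machine, repeated runs of a program like this one would return the correlated outcomes 00 and 11 with equal odds, the signature of an entangled pair.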

By signing up for the service, both quantum computing researchers and scientists in other fields will get the chance to practice writing and testing applications that will run on future quantum computers. Rigetti likely hopes that such researchers, whether at academic labs or companies, will end up becoming paying customers.

“We’re a full stack quantum computing company,” says Madhav Thattai, Rigetti’s chief strategy officer. “That means we do everything from design and fabrication of quantum chips to packaging the architecture needed to control the chips, and then building the software so that people can write algorithms and program the system.”

Much still has to be done before quantum computing becomes a practical tool for researchers and companies. Rigetti’s approach to universal quantum computing uses silicon-based superconducting qubits that can take advantage of semiconductor manufacturing techniques common in today’s computer industry. That means engineers can more easily produce the larger arrays of qubits necessary to prove that quantum computing can outperform classical computing—a benchmark that has yet to be reached.

Google researchers hope to demonstrate such “quantum supremacy” over classical computing with a 49-qubit chip by the end of 2017. If they succeed, it would be an “incredibly exciting scientific achievement,” Bestwick says. Rigetti Computing is currently working on scaling up from 8-qubit chips.
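A rough way to see why the supremacy threshold sits near 49 qubits: simulating a quantum chip by brute force means storing 2^n complex amplitudes, a number that outgrows classical memory with startling speed. A back-of-envelope sketch:

```python
# Memory needed to hold the full state vector of an n-qubit machine,
# at 16 bytes per amplitude (one double-precision complex number).
for n in (30, 40, 49):
    tib = (2 ** n) * 16 / 2 ** 40  # bytes -> tebibytes
    print(f"{n} qubits: {tib:,.2f} TiB")

# 30 qubits:     0.02 TiB  -- fits on a laptop
# 40 qubits:    16.00 TiB  -- needs a large cluster
# 49 qubits: 8,192.00 TiB  -- about 8 PiB, beyond any single machine's memory
```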

But even that huge step forward in demonstrating the advantages of quantum computing would not result in a quantum computer that is a practical problem-solving tool. Many researchers believe that practical quantum computing requires systems to correct the quantum errors that can arise in fragile qubits. Error correction will almost certainly be necessary to achieve the future promise of 100-million-qubit systems that could perform tasks that are currently impractical, such as cracking modern cryptography keys.
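The payoff of error correction is easiest to see in a classical toy model. The sketch below simulates a three-copy repetition code with majority-vote decoding; real quantum codes, such as the surface code, are far more elaborate, but the principle that redundancy suppresses errors quadratically is the same:

```python
import random

def logical_error_rate(p, trials=200_000):
    """Encode one bit as three copies, flip each independently with
    probability p, and decode by majority vote."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(3))
        if flips >= 2:  # the vote fails only when two or more copies flip
            failures += 1
    return failures / trials

for p in (0.1, 0.01):
    print(f"physical error rate {p}: logical error rate ~{logical_error_rate(p):.5f}")

# Analytically the logical rate is 3p^2 - 2p^3, so every tenfold improvement
# in the physical error rate buys roughly a hundredfold improvement overall.
```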

Though quantum computing's biggest payoffs may seem far off, Rigetti Computing is complementing its long-term strategy with a near-term one that can serve clients long before more capable quantum computers arrive. The quantum computing cloud service is one example. The startup also believes a hybrid system that combines classical computing architecture with quantum computing chips can solve many practical problems in the short term, especially in machine learning and chemistry. What's more, the company says, such hybrid classical/quantum computers can perform well even without error correction.
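The hybrid pattern in question, used by variational algorithms for chemistry and machine learning, pairs a classical optimizer with a quantum co-processor: the classical side proposes circuit parameters, the quantum side measures an objective value, and the loop repeats. A minimal sketch, with `run_circuit` as a hypothetical stand-in for a call to a quantum chip or virtual machine:

```python
import numpy as np
from scipy.optimize import minimize

def run_circuit(params):
    # Hypothetical stand-in: a real version would prepare a parameterized
    # quantum state on hardware (or a simulator) and return the measured
    # expectation value of some observable, e.g. a molecule's energy.
    return np.cos(params[0]) + 0.5 * np.sin(params[1])

# The classical optimizer iteratively refines the circuit parameters.
result = minimize(run_circuit, x0=np.array([0.1, 0.1]), method="Nelder-Mead")
print("best parameters:", result.x)
print("minimum objective:", result.fun)
```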

Global Race Toward Exascale Will Drive Supercomputing

For the first time in 21 years, the United States no longer claims even the bronze medal. With this week's release of the latest Top500 supercomputer ranking, the three fastest supercomputers in the world are now run by China (home to both the first- and second-place finishers) and Switzerland. And while the supercomputer horse race is spectacle enough unto itself, a new report on the supercomputer industry highlights broader trends behind both the latest and the last few years of Top500 rankings.

The report, commissioned last year by the Japanese national science agency Riken, outlines a worldwide race toward exascale computers in which the U.S. sees R&D spending and supercomputer talent pools shrink, Europe jumps into the breach with increased funding, and China pushes hard to become the new global leader, despite a still small user and industry base ready to use the world’s most powerful supercomputers.

Steve Conway, report co-author and senior vice president of research at Hyperion, says the industry trend in high-performance computing is toward laying groundwork for pervasive AI and big data applications like autonomous cars and machine learning. And unlike more specialized supercomputer applications from years past, the workloads of tomorrow’s supercomputers will likely be mainstream and even consumer-facing applications.

“Ten years ago the rationale for spending on supercomputers was primarily two things: national security and scientific leadership, and I think there are a lot of people who still think that supercomputers are limited to problems like will a proton go left or right,” he says. “But in fact, there’s been strong recognition [of the connections] between supercomputing leadership and industrial leadership.”

“With the rise of big data, high-performance computing has moved to the forefront of research in things like autonomous vehicle design, precision medicine, deep learning, and AI,” Conway says. “And you don’t have to ask supercomputing companies if this is true. Ask Google and Baidu. There’s a reason why Facebook has already bought 26 supercomputers.”

As the 72-page Hyperion report notes, “IDC believes that countries that fail to fund development of these future leadership-class supercomputers run a high risk of falling behind other highly developed countries in scientific innovation, with later harmful consequences for their national economies.” (The authors wrote the report in 2016 as part of the industry research group IDC; this year they spun off to form the research firm Hyperion.)

Conway says that solutions to problems plaguing HPC systems today will be found in the consumer electronics and industry applications of the future. So while operating massively parallel computers with multiple millions of cores may today be a problem facing only the world's fastest and second-fastest supercomputers—China's Sunway TaihuLight and Tianhe-2, running on 10.6 million and 3.1 million cores, respectively—that won't hold true forever. And because China is the only country tackling the problem now, it is the most likely to develop the relevant technology first, technology the world will want when cloud computing with millions of cores approaches the mainstream.

The same logic applies to optimizing the ultrafast data rates that today's top HPC systems use and to minimizing the megawatts of electricity they consume. And as the world's supercomputers approach exascale, that is, the 1-exaflop (1,000-petaflop) mark, new challenges will no doubt arise.

So, for instance, the report says that rapidly shutting down and powering up idle cores will be one trick supercomputer designers use to trim their systems' massive power budgets. High-density storage, in the 100-petabyte range, will also become paramount to house the big datasets these supercomputers consume.

“You could build an exascale system today,” Conway says. “But it would take well over 100 megawatts, which nobody’s going to supply, because that’s over a 100 million dollar electricity bill. So it has to get the electricity usage under control. Everybody’s trying to get it in the 20 to 30 megawatts range. And it has to be dense. Much denser than any computing today. It’s got to fit inside some kind of building. You don’t want the building to be 10 miles long. And also the denser the machine, the faster the machine is going to be too.”
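The arithmetic behind that argument is straightforward. Assuming an illustrative industrial electricity rate of about 12 cents per kilowatt-hour (actual rates vary by region):

```python
HOURS_PER_YEAR = 8760
RATE_USD_PER_KWH = 0.12  # assumed industrial rate, for illustration

def annual_power_bill(megawatts):
    return megawatts * 1_000 * HOURS_PER_YEAR * RATE_USD_PER_KWH

print(f"100 MW machine: ${annual_power_bill(100) / 1e6:.0f}M per year")  # ~$105M
print(f" 25 MW machine: ${annual_power_bill(25) / 1e6:.0f}M per year")   # ~$26M

# Hitting 1 exaflop (10**18 flops) inside a 20 MW envelope also demands
# an efficiency of 10**18 / (20 * 10**6) = 50 gigaflops per watt.
print(f"required efficiency at 20 MW: {1e18 / 20e6 / 1e9:.0f} gigaflops/watt")
```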

Conway predicts that these and other challenges will be surmounted, and the first exaflop supercomputers will appear on the Top500 list around 2021, while exaflop supercomputing could become commonplace by 2023.

A Search Engine for the Brain Is in Sight

The human brain is smaller than you might expect: One of them, dripping with formaldehyde, fits in a single gloved hand of a lab supervisor here at the Jülich Research Center, in Germany.

Soon, this rubbery organ will be frozen solid, coated in glue, and then sliced into several thousand wispy slivers, each just 60 micrometers thick. A custom apparatus will scan those sections using 3D polarized light imaging (3D-PLI) to measure the spatial orientation of nerve fibers at the micrometer level. The scans will be gathered into a colorful 3D digital reconstruction depicting the direction of individual nerve fibers on larger scales—roughly 40 gigabytes of data for a single slice and up to a few petabytes for the entire brain. And this brain is just one of several to be scanned.
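To get a feel for those numbers, here is the slice arithmetic under one illustrative assumption: roughly 150 millimeters of tissue along the cutting axis. At 60-micrometer sections that yields a few thousand slices, and at 40 gigabytes apiece a single imaging pass already lands in the hundreds of terabytes; additional or higher-resolution passes are presumably what push a whole brain toward the petabyte figure cited above.

```python
SLICE_THICKNESS_UM = 60   # section thickness from the article
TISSUE_EXTENT_MM = 150    # assumed cutting extent, for illustration only
GB_PER_SLICE = 40         # per-slice data volume cited in the article

slices = TISSUE_EXTENT_MM * 1_000 // SLICE_THICKNESS_UM
terabytes = slices * GB_PER_SLICE / 1_000
print(f"~{slices} slices, ~{terabytes:.0f} TB for one imaging pass")
# ~2500 slices, ~100 TB -- and a whole-brain dataset, per the article,
# can grow to a few petabytes.
```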

Neuroscientists hope that by combining and exploring data gathered with this and other new instruments they’ll be able to answer fundamental questions about the brain. The quest is one of the final frontiers—and one of the greatest challenges—in science.

Imagine being able to explore the brain the way you explore a website. You might search for the corpus callosum—the stalk that connects the brain’s two hemispheres—and then flip through individual nerve fibers in it. Next, you might view networks of cells as they light up during a verbal memory test, or scroll through protein receptors embedded in the tissue.