Toward optical quantum computing

Ordinarily, light particles — photons — don’t interact. If two photons collide in a vacuum, they simply pass through each other.

An efficient way to make photons interact could open new prospects for both classical optics and quantum computing, an experimental technology that promises large speedups on some types of calculations.

In recent years, physicists have enabled photon-photon interactions using atoms of rare elements cooled to very low temperatures.

But in the latest issue of Physical Review Letters, MIT researchers describe a new technique for enabling photon-photon interactions at room temperature, using a silicon crystal with distinctive patterns etched into it. In physics jargon, the crystal introduces “nonlinearities” into the transmission of an optical signal.
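
In physics terms, an optical nonlinearity means the material's response is no longer proportional to the applied field. A standard textbook way to express this (general nonlinear optics, not a formula taken from the MIT paper itself) is to expand the induced polarization P in powers of the electric field E:

$$
P = \epsilon_0 \left( \chi^{(1)} E + \chi^{(2)} E^2 + \chi^{(3)} E^3 + \cdots \right)
$$

The higher-order susceptibilities χ⁽²⁾, χ⁽³⁾, … are what let one light field influence another; the challenge the researchers tackle is making those terms matter even at single-photon intensities.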

“All of these approaches that had atoms or atom-like particles require low temperatures and work over a narrow frequency band,” says Dirk Englund, an associate professor of electrical engineering and computer science at MIT and senior author on the new paper. “It’s been a holy grail to come up with methods to realize single-photon-level nonlinearities at room temperature under ambient conditions.”

Joining Englund on the paper are Hyeongrak Choi, a graduate student in

Tips to Uninstall MS Office 2016

Microsoft Office is one of the most common application suites, used on home and business PCs alike. The package includes MS Word, PowerPoint, Excel, Outlook, Access, Publisher, and other applications. Over time, Microsoft has released various versions of MS Office, and each new version introduced features that let users accomplish a range of tasks without hassle, all on the same platform. Today, you can install MS Office on Windows, macOS, or a mobile phone.

Already using MS Office 2016 but want to upgrade to the latest version? To do so, you first need to uninstall the previously installed Office from your device (laptop, desktop, or 2-in-1). You may also want to uninstall MS Office if it is misbehaving. Either way, you don't need to be a pro: this article provides the complete procedure. Have a look:

  • The first step is to select your operating system
  • Download the Office uninstall tool, i.e., Microsoft Fix It
  • Run the Fix It tool and follow the on-screen instructions
  • Click “Apply this fix” and let it run

Computer Hardware Firms Slip

Almost half of the computer hardware companies surveyed in a recent report had earnings returns that trailed even passbook savings accounts last year, according to an investment banking firm specializing in information technology company mergers.

The other side of the technology coin, information services companies, fared better. These include firms involved in software, publishing, printing and related supplies, marketing services, databases, and computer processing.

As a result, investors have fallen out of love with the high-tech hardware firms, once considered the darlings of Wall Street. The median market-price to book-value ratio for hardware firms was 1.4 to 1, the study found, about the same as representative cross-industry indexes. For information services companies, the ratio was 2.5 to 1.

Harvey Poppel, a partner at Broadview Associates, Ft. Lee, N.J., said his firm studied 463 publicly held U.S. information technology firms in both the hardware and information services sectors, with total revenues of more than $200 billion.

About 44 percent of the companies classified as computer and communications hardware firms had returns on

GeIL Releases EVO SPEAR DDR4 Hardcore Memory

GeIL (Golden Emperor International Ltd.), one of the world's leading PC components and peripherals manufacturers, announced its latest DDR4 hardcore gaming memory modules, the EVO SPEAR Series and EVO SPEAR AMD Edition Series. Available as single modules and in kits of up to 64GB, the modules run at as low as 1.2V and at most 1.35V, resulting in lower power consumption and higher reliability. Featuring a stylish and stealthy standard-height heat spreader, EVO SPEAR Series modules fit most case designs, from SFF (Small Form Factor) systems to full-sized gaming PCs. The EVO SPEAR Series and EVO SPEAR AMD Edition Series are available in black with a black PCB and are perfect for gamers and enthusiasts who want a cost-efficient upgrade for faster gaming, video editing, and 3D rendering.

The EVO SPEAR Series is available in frequencies from 2133MHz to 3466MHz and is optimized for Intel® Core™ X, i7, and i5 processors as well as the Z200 and X299 series chipsets.

The EVO SPEAR AMD Edition Series is fully compatible with the latest AMD Ryzen 7 and Ryzen 5 processors and AM4 motherboards, and is available in frequencies from 2133MHz to 3200MHz.

“EVO SPEAR Series, and EVO SPEAR AMD Edition Series are ideal for gamers, enthusiasts, and case modders looking to

Fractal Design Launches New Tempered Glass Define C Chassis

So many cases on the market today are made to be all things to all people. However, for many users this results in a chassis full of empty bays, unused mounts, and excess bulk. Created for those who demand a flexible platform for a powerful ATX or Micro ATX build that wastes no space, the Define C TG Series strikes this balance of capacity and efficiency while showing off the build through a full tempered glass side panel.

Smaller than the usual ATX and Micro ATX cases, the Define C TG and Define Mini C TG, with their optimized interiors, provide the perfect base for users. The open-air design offers unobstructed airflow across your core components, with high performance and silent computing in mind at every step.

Extensive cooling support via both air and water is offered to make sure even the most powerful systems can be cooled effectively. Carrying signature Define series traits, the Define C TG Series brings with it that iconic front panel design, dense sound-dampening material throughout, and ModuVent technology in the top panel. Those wanting to remove the ModuVent to add more fans or

Why Hardware Engineers Should Think Like Cybercriminals

The future of cybersecurity is in the hands of hardware engineers. That’s what Scott Borg, director of the U.S. Cyber Consequences Unit, told 130 chief technical officers, engineering directors, and key researchers from MEMS and sensors companies and laboratories Thursday morning.

Borg, speaking at the MEMS and Sensors Technical Congress, held on the campus of Stanford University, warned that “the people in this room are now moving into the crosshairs of cyberhackers in a way that has never happened before.”

And Borg should know. He and his colleagues at the Cyber Consequences Unit (a nonprofit research institute) predicted the Stuxnet attack and some major developments in cybercrime over the last 15 years.

Increasingly, hackers are focusing on hardware rather than software, particularly equipment used in industry, he indicated.

“Initially,” he said, “they focused on operations control, monitoring different locations from a central site. Then they moved to process control, including programmable logic controllers and local networks. Then they migrated to embedded devices and the ability to control individual pieces of equipment. Now they are migrating to the actual sensors, the MEMS devices.”

“You can imagine countless attacks manipulating physical things,” Borg said. And imagining those things definitely keeps him up at night—it’s

Bad at Math, Good at Everything Else

Painful exercises in basic arithmetic are a vivid part of our elementary school memories. A multiplication like 3,752 × 6,901 carried out with just pencil and paper for assistance may well take up to a minute. Of course, today, with a cellphone always at hand, we can quickly check that the result of our little exercise is 25,892,552. Indeed, the processors in modern cellphones can together carry out more than 100 billion such operations per second. What’s more, the chips consume just a few watts of power, making them vastly more efficient than our slow brains, which consume about 20 watts and need significantly more time to achieve the same result.
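
As a rough sanity check on those figures (a quick sketch with assumed, order-of-magnitude numbers, not measurements from the article), a few lines of Python make the efficiency gap concrete:

```python
# Verify the article's example multiplication.
print(3752 * 6901)  # 25892552, matching the quoted 25,892,552

# Rough efficiency comparison (assumed figures): a phone SoC doing
# ~1e11 operations per second on ~3 W, versus a ~20 W brain taking
# about a minute per pencil-and-paper multiplication.
phone_ops_per_joule = 1e11 / 3
brain_ops_per_joule = (1 / 60) / 20
print(phone_ops_per_joule / brain_ops_per_joule)  # ~4e13-fold gap
```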

Of course, the brain didn’t evolve to perform arithmetic. So it does that rather badly. But it excels at processing a continuous stream of information from our surroundings. And it acts on that information—sometimes far more rapidly than we’re aware of. No matter how much energy a conventional computer consumes, it will struggle with feats the brain finds easy, such as understanding language and running up a flight of stairs.

If we could create machines with the computational capabilities and energy efficiency of the brain, it would be a

Can We Quantify Machine Consciousness?

Imagine that at some time in the not-too-distant future, you’ve bought a smartphone that comes bundled with a personal digital assistant (PDA) living in the cloud. You assign a sexy female voice to the PDA and give it access to all of your emails, social media accounts, calendar, photo album, contacts, and other bits and flotsam of your digital life. She—for that’s how you quickly think of her—knows you better than your mother, your soon-to-be ex-wife, your friends, or your therapist. Her command of English is flawless; you have endless conversations about daily events; she gets your jokes. She is the last voice you hear before you drift off to sleep and the first upon awakening. You panic when she’s off-line. She becomes indispensable to your well-being and so, naturally, you fall in love. Occasionally, you wonder whether she truly reciprocates your feelings and whether she is even capable of experiencing anything at all. But the warm, husky tone of her voice and her ability to be that perfect foil to your narcissistic desires overcome these existential doubts. Alas, your infatuation eventually cools off after you realize she is carrying on equally intimate conversations with thousands of other customers.

The Benefits of Building an Artificial Brain

In the mid-1940s, a few brilliant people drew up the basic blueprints of the computer age. They conceived a general-purpose machine based on a processing unit made up of specialized subunits and registers, which operated on stored instructions and data. Later inventions—transistors, integrated circuits, solid-state memory—would supercharge this concept into the greatest tool ever created by humankind.

So here we are, with machines that can churn through tens of quadrillions of operations per second. We have voice-recognition-enabled assistants in our phones and homes. Computers routinely thrash us in our ancient games. And yet we still don’t have what we want: machines that can communicate easily with us, understand and anticipate our needs deeply and unerringly, and reliably navigate our world.

Now, as Moore’s Law seems to be starting some sort of long goodbye, a couple of themes are dominating discussions of computing’s future. One centers on quantum computers and stupendous feats of decryption, genome analysis, and drug development. The other, more interesting vision is of machines that have something like human cognition. They will be our intellectual partners in solving some of the great medical, technical, and scientific problems confronting humanity. And their thinking may share some of the fantastic and maddening beauty, unpredictability,

We Could Build an Artificial Brain Right Now

Brain-inspired computing is having a moment. Artificial neural network algorithms like deep learning, which are very loosely based on the way the human brain operates, now allow digital computers to perform such extraordinary feats as translating language, hunting for subtle patterns in huge amounts of data, and beating the best human players at Go.

But even as engineers continue to push this mighty computing strategy, the energy efficiency of digital computing is fast approaching its limits. Our data centers and supercomputers already draw megawatts—some 2 percent of the electricity consumed in the United States goes to data centers alone. The human brain, by contrast, runs quite well on about 20 watts, which represents the power produced by just a fraction of the food a person eats each day. If we want to keep improving computing, we will need our computers to become more like our brains.

Hence the recent focus on neuromorphic technology, which promises to move computing beyond simple neural networks and toward circuits that operate more like the brain’s neurons and synapses do. The development of such physical brainlike circuitry is actually pretty far along. Work at my lab and others around the world over the

U.S. Slips in New Top500 Supercomputer Ranking

In June, we can look forward to two things: the Belmont Stakes and the first of the twice-yearly TOP500 rankings of supercomputers. This month, a well-known gray and black colt named Tapwrit came in first at Belmont, and a well-known gray and black supercomputer named Sunway TaihuLight came in first on June’s TOP500 list, released today in conjunction with the opening session of the ISC High Performance conference in Frankfurt. Neither was a great surprise.

Tapwrit was the second favorite at Belmont, and Sunway TaihuLight was the clear pick for the number-one position on the TOP500 list, having enjoyed that first-place ranking since June 2016, when it beat out another Chinese supercomputer, Tianhe-2. The TaihuLight, capable of some 93 petaflops in this year's benchmark tests, was designed by the National Research Center of Parallel Computer Engineering & Technology (NRCPC) and is located at the National Supercomputing Center in Wuxi, China. Tianhe-2, capable of almost 34 petaflops, was developed by China's National University of Defense Technology (NUDT), is deployed at the National Supercomputer Center in Guangzhou, and still enjoys the number-two position on the list.

More of a surprise, and perhaps more of a disappointment for some, is that the highest-ranking U.S. contender, the Department

Raspberry Pi Merger With CoderDojo Isn’t All It Seems

This past Friday, the Raspberry Pi Foundation and the CoderDojo Foundation became one. The Raspberry Pi Foundation described it as “a merger that will give many more young people all over the world new opportunities to learn how to be creative with technology.” Maybe. Or maybe not. Before I describe why I’m a bit skeptical, let me first take a moment to explain more about what these two entities are.

The Raspberry Pi Foundation is a charitable organization created in the U.K. in 2009. Its one-liner mission statement says it works to “put the power of digital making into the hands of people all over the world.” In addition to designing and manufacturing an amazingly popular line of inexpensive single-board computers—the Raspberry Pi—the Foundation has also worked very hard at providing educational resources.

The CoderDojo Foundation is an outgrowth of a volunteer-led, community-based programming club established in Cork, Ireland, in 2011. That model was later cloned in many other places and can now be found in 63 countries, where local coding clubs operate under the CoderDojo banner.

So both organizations clearly share a keen interest in having young people learn about computers and coding. Indeed, the Raspberry Pi Foundation had earlier

In the Future, Machines Will Borrow Our Brain’s Best Tricks

Steve sits up and takes in the crisp new daylight pouring through the bedroom window. He looks down at his companion, still pretending to sleep. “Okay, Kiri, I’m up.”

She stirs out of bed and begins dressing. “You received 164 messages overnight. I answered all but one.”

In the bathroom, Steve stares at his disheveled self. “Fine, give it to me.”

“Your mother wants to know why you won’t get a real girlfriend.”

He bursts out laughing. “Anything else?”

“Your cholesterol is creeping up again. And there have been 15,712 attempts to hack my mind in the last hour.”

“Good grief! Can you identify the source?”

“It’s distributed. Mostly inducements to purchase a new RF oven. I’m shifting ciphers and restricting network traffic.”

“Okay. Let me know if you start hearing voices.” Steve pauses. “Any good deals?”

“One with remote control is in our price range. It has mostly good reviews.”

“You can buy it.”

Kiri smiles. “I’ll stay in bed and cook dinner with a thought.”

Steve goes to the car and takes his seat.

Car, a creature of habit, pulls out and heads to

Rigetti Launches Full-Stack Quantum Computing Service and Quantum IC Fab

Much of the ongoing quantum computing battle among tech giants such as Google and IBM has focused on developing the hardware necessary to solve impossible classical computing problems. A Berkeley-based startup looks to beat those larger rivals with a one-two combo: a fab lab designed for speedy creation of better quantum circuits and a quantum computing cloud service that provides early hands-on experience with writing and testing software.

Rigetti Computing recently unveiled its Fab-1 facility, which will enable its engineers to rapidly build new generations of quantum computing hardware based on quantum bits, or qubits. The facility can spit out entirely new designs for 3D-integrated quantum circuits within about two weeks—much faster than the months usually required for academic research teams to design and build new quantum computing chips. It’s not so much a quantum computing chip factory as it is a rapid prototyping facility for experimental designs.

“We’re fairly confident it’s the only dedicated quantum computing fab in the world,” says Andrew Bestwick, director of engineering at Rigetti Computing. “By the standards of industry, it’s still quite small and the volume is low, but it’s designed for extremely high-quality manufacturing of these quantum circuits that emphasizes speed and flexibility.”

But Rigetti is not betting on faster hardware innovation alone. It

Global Race Toward Exascale Will Drive Supercomputing

For the first time in 21 years, the United States no longer claimed even the bronze medal. With this week’s release of the latest Top 500 supercomputer ranking, the top three fastest supercomputers in the world are now run by China (with both first and second place finishers) and Switzerland. And while the supercomputer horserace is spectacle enough unto itself, a new report on the supercomputer industry highlights broader trends behind both the latest and the last few years of Top500 rankings.

The report, commissioned last year by the Japanese national science agency Riken, outlines a worldwide race toward exascale computers in which the U.S. sees R&D spending and supercomputer talent pools shrink, Europe jumps into the breach with increased funding, and China pushes hard to become the new global leader, despite a still small user and industry base ready to use the world’s most powerful supercomputers.

Steve Conway, report co-author and senior vice president of research at Hyperion, says the industry trend in high-performance computing is toward laying groundwork for pervasive AI and big data applications like autonomous cars and machine learning. And unlike more specialized supercomputer applications from years past, the workloads of tomorrow’s supercomputers will likely be mainstream and even consumer-facing

A Search Engine for the Brain Is in Sight

The human brain is smaller than you might expect: One of them, dripping with formaldehyde, fits in a single gloved hand of a lab supervisor here at the Jülich Research Center, in Germany.

Soon, this rubbery organ will be frozen solid, coated in glue, and then sliced into several thousand wispy slivers, each just 60 micrometers thick. A custom apparatus will scan those sections using 3D polarized light imaging (3D-PLI) to measure the spatial orientation of nerve fibers at the micrometer level. The scans will be gathered into a colorful 3D digital reconstruction depicting the direction of individual nerve fibers on larger scales—roughly 40 gigabytes of data for a single slice and up to a few petabytes for the entire brain. And this brain is just one of several to be scanned.

Neuroscientists hope that by combining and exploring data gathered with this and other new instruments they’ll be able to answer fundamental questions about the brain. The quest is one of the final frontiers—and one of the greatest challenges—in science.

Imagine being able to explore the brain the way you explore a website. You might search for the corpus callosum—the stalk that connects the brain’s two hemispheres—and then

Even Ordinary Computer Users Could Access Secret Quantum Computing

You may not need a quantum computer of your own to securely use quantum computing in the future. For the first time, researchers have shown how even ordinary classical computer users could remotely access quantum computing resources online while keeping their quantum computations securely hidden from the quantum computer itself.

Tech giants such as Google and IBM are racing to build universal quantum computers that could someday analyze millions of possible solutions much faster than today’s most powerful classical supercomputers. Such companies have also begun offering online access to their early quantum processors as a glimpse of how anyone could tap the power of cloud-based quantum computing. Until recently, most researchers believed that there was no way for remote users to securely hide their quantum computations from prying eyes unless they too possessed quantum computers. That assumption is now being challenged by researchers in Singapore and Australia through a new paper published in the 11 July issue of the journal Physical Review X.

“Frankly, I think we are all quite surprised that this is possible,” says Joseph Fitzsimons, a theoretical physicist for the Centre for Quantum Technologies at the National University of Singapore and principal investigator on the study. “There had been a number

The Real Future of Quantum Computing?

Instead of creating quantum computers based on qubits, which can each adopt only two possible options, scientists have now developed a microchip that can generate “qudits” that can each assume 10 or more states, potentially opening up a new way to create incredibly powerful quantum computers, a new study finds.

Classical computers switch transistors either on or off to symbolize data as ones and zeroes. In contrast, quantum computers use quantum bits, or qubits, which, because of the bizarre nature of quantum physics, can be in a state of superposition where they simultaneously act as both 1 and 0.

The superpositions that qubits can adopt let them each help perform two calculations at once. If two qubits are quantum-mechanically linked, or entangled, they can help perform four calculations simultaneously; three qubits, eight calculations; and so on. As a result, a quantum computer with 300 qubits could perform more calculations in an instant than there are atoms in the known universe, solving certain problems much faster than classical computers. However, superpositions are extraordinarily fragile, making it difficult to work with multiple qubits.
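
The scaling behind that claim is easy to check (a quick illustrative sketch; the qudit comparison extrapolates from the article's 10-state example):

```python
import math

# n entangled two-state qubits span 2**n basis states;
# d-state qudits span d**n.
n = 300
qubit_states = 2 ** n
print(f"{qubit_states:.2e}")    # ~2.04e+90
print(qubit_states > 10 ** 80)  # exceeds ~1e80 atoms in the known universe

# Ten-state qudits reach the same state count with far fewer carriers:
print(math.ceil(n * math.log(2) / math.log(10)))  # 91 qudits ~ 300 qubits
```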

Most attempts at building practical quantum computers rely on particles that serve as qubits. However, scientists have long known that they could in principle

Western Digital WD1402A UART

Gordon Bell is famous for launching the PDP series of minicomputers at Digital Equipment Corp. in the 1960s. These ushered in the era of networked and interactive computing that would come to full flower with the introduction of the personal computer in the 1970s. But while minicomputers as a distinct class now belong to the history books, Bell also invented a lesser-known but no less significant piece of technology that's still in action all over the world: the universal asynchronous receiver/transmitter, or UART.

UARTs are used to let two digital devices communicate with each other by sending bits one at a time over a serial interface without bothering the device’s primary processor with the details.
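
To make “one bit at a time” concrete, here is a minimal sketch of the common 8-N-1 framing that UARTs use (illustrative code of my own, not Bell's original design): the line idles high, and each byte is sent least-significant-bit first between a start bit (0) and a stop bit (1).

```python
def uart_frame(byte: int) -> list[int]:
    """Frame one byte as 8-N-1: start bit, 8 data bits (LSB first), stop bit."""
    data_bits = [(byte >> i) & 1 for i in range(8)]
    return [0] + data_bits + [1]

# 'A' (0x41) goes out as: start, 1,0,0,0,0,0,1,0, stop
print(uart_frame(ord("A")))  # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```

The receiver keeps its own clock, detects the falling edge of the start bit, and samples the middle of each subsequent bit period, which is why both sides must agree on the baud rate in advance.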

Today, more sophisticated serial setups are available, such as the ubiquitous USB standard, but for a time UARTs ruled supreme as the way to, for example, connect modems to PCs. And the simple UART still has its place, not least as the communication method of last resort with a lot of modern network equipment.

The UART was invented because of Bell’s own need to connect a Teletype to a PDP-1, a task that required converting parallel signals into serial signals. He cooked

Low-Cost Pliable Materials Transform Glove Into Sign-to-Text Machine

Researchers have made a low-cost smart glove that can translate the American Sign Language alphabet into text and send the messages via Bluetooth to a smartphone or computer. The glove can also be used to control a virtual hand.

While it could aid the deaf community, its developers say the smart glove could prove really valuable for virtual and augmented reality, remote surgery, and defense uses like controlling bomb-defusing robots.

This isn’t the first gesture-tracking glove. There are companies pursuing similar devices that recognize gestures for computer control, à la the 2002 film Minority Report. Some researchers have also specifically developed gloves that convert sign language into text or audible speech.

What’s different about the new glove is its use of extremely low-cost, pliable materials, says developer Darren Lipomi, a nanoengineering professor at the University of California, San Diego. The components in the system reported in the journal PLOS ONE cost less than US $100 in total, Lipomi says. And unlike other gesture-recognizing gloves, which use MEMS sensors made of brittle materials, the soft stretchable materials in Lipomi’s glove should make it more robust.

The key components of the new glove are