From early detection and internal treatment of diseases to futuristic applications like augmenting human memory, biological computing, or biocomputing, has the potential to revolutionize both medicine and computing.

Traditional computer hardware is limited in its ability to interface with living organs, which has constrained the development of medical devices. Computerized implants require a constant supply of electricity, can cause scarring in soft tissue that renders them unusable, and cannot heal themselves the way living organisms can. Through the use of biological molecules such as DNA or proteins, biocomputing has the potential to overcome these limitations.

Biocomputing is typically done either with live cells or with non-living, enzyme-free molecules. Live cells can feed themselves and can heal, but it can be difficult to redirect them from their ordinary functions toward computation. Non-living molecules solve some of the problems of live cells, but have weak output signals and are difficult to fine-tune and regulate.

A tiny vibrating crystal weighing little more than a grain of sand has become the heaviest object ever to be recorded in a superposition of locations.

Physicists at the Swiss Federal Institute of Technology (ETH) Zurich coupled a mechanical resonator to a type of superconducting circuit commonly used in quantum computing to effectively replicate Erwin Schrödinger’s famous thought experiment on an unprecedented scale.

Ironically, Schrödinger would have been somewhat skeptical that anything so large – well, anything at all – could exist in a nebulous state of reality.

Encoding breakthrough allows a wider set of problems to be solved using neutral-atom quantum computers. QuEra Computing and university researchers have developed a method to expand the optimization calculations possible with neutral-atom quantum computers. This breakthrough, published in PRX Quantum, overcomes hardware limitations, enabling solutions to more complex problems and thus broadening applications in industries like logistics and pharmaceuticals.

Topological superconductors are superconducting materials with unique characteristics, including the appearance of so-called in-gap Majorana states. These bound states can serve as qubits, making topological superconductors particularly promising for the creation of quantum computing technologies.

Some physicists have recently been exploring the potential for creating topological superconductors by integrating superconductors with swirling configurations of atomic magnetic dipoles (spins), known as quantum skyrmion crystals. Most of these efforts suggested sandwiching quantum skyrmion crystals between superconductors to achieve topological superconductivity.

Kristian MÊland and Asle Sudbø, two researchers at the Norwegian University of Science and Technology, have recently proposed an alternative model system of topological superconductivity, which does not contain superconducting materials. This theoretical model, introduced in Physical Review Letters, would instead use a sandwich structure of a heavy metal, a magnetic insulator, and a normal metal, where the heavy metal induces a quantum skyrmion crystal in the magnetic insulator.

Advances in quantum computation for electronic structure, and particularly heuristic quantum algorithms, create an ongoing need to characterize the performance and limitations of these methods. Here we discuss some potential pitfalls connected with the use of hardware-efficient AnsÀtze in variational quantum simulations of electronic structure. We illustrate that hardware-efficient AnsÀtze may break Hamiltonian symmetries and yield nondifferentiable potential energy curves, in addition to the well-known difficulty of optimizing variational parameters. We discuss the interplay between these limitations by carrying out a comparative analysis of hardware-efficient AnsÀtze versus unitary coupled cluster and full configuration interaction, and of second- and first-quantization strategies to encode fermionic degrees of freedom to qubits.
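As a toy illustration of the symmetry-breaking pitfall (a sketch under simple assumptions, not the paper's actual calculation), the numpy snippet below applies a generic two-qubit hardware-efficient ansatz, one RY rotation per qubit plus a CNOT entangler, to a Jordan-Wigner basis state holding one electron in two spin-orbitals; the gate layout and parameter values are illustrative choices. A nonzero variance of the particle-number operator N shows that the circuit has mixed particle-number sectors, i.e., broken a Hamiltonian symmetry, whereas a symmetry-preserving ansatz would keep Var(N) = 0 for every parameter value.

```python
import numpy as np

I2 = np.eye(2)

def ry(theta):
    """Single-qubit RY rotation."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with the first qubit as control (basis order |q0 q1>)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def hardware_efficient_ansatz(params):
    """One RY layer followed by a CNOT entangler (illustrative layout)."""
    return CNOT @ np.kron(ry(params[0]), ry(params[1]))

# Jordan-Wigner particle-number operator N = n0 + n1 on two qubits
n_op = np.diag([0.0, 1.0])                    # occupation of one spin-orbital
N = np.kron(n_op, I2) + np.kron(I2, n_op)

psi0 = np.zeros(4); psi0[1] = 1.0             # |01>: one electron
psi = hardware_efficient_ansatz([0.7, 1.3]) @ psi0

mean_N = psi @ N @ psi
var_N = psi @ (N @ N) @ psi - mean_N**2
print(f"<N> = {mean_N:.3f}, Var(N) = {var_N:.3f}")  # Var(N) > 0: symmetry broken
```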

What does quantum computing have in common with the Oscar-winning movie “Everything Everywhere All at Once”? One is a mind-blowing work of fiction, while the other is an emerging frontier in computer science — but both of them deal with rearrangements of particles in superposition that don’t match our usual view of reality.

Fortunately, theoretical physicist Michio Kaku has provided a guidebook to the real-life frontier, titled “Quantum Supremacy: How the Quantum Computer Revolution Will Change Everything.”

“We’re talking about the next generation of computers that are going to replace digital computers,” Kaku says in the latest episode of the Fiction Science podcast. “Today, for example, we don’t use the abacus anymore in Asia. In the future, we’ll view digital computers like we view the abacus: old-fashioned, obsolete. This is for the garbage can. That’s how the future is going to evolve.”

According to computational complexity theory, mathematical problems have different levels of difficulty in the context of their solvability. While a classical computer can solve some problems (P) in polynomial time—i.e., the time required for solving P is a polynomial function of the input size—it often fails to solve NP problems that scale exponentially with the problem size and thus cannot be solved in polynomial time. Classical computers based on semiconductor devices are, therefore, inadequate for solving sufficiently large NP problems.
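To make the exponential scaling concrete, here is a small self-contained Python example (problem sizes and values are arbitrary): a brute-force solver for subset sum, an NP-complete problem, must in the worst case examine all 2^n subsets, so its runtime roughly doubles with each additional input element.

```python
import random
import time
from itertools import combinations

def subset_sum_bruteforce(weights, target):
    """Exhaustive search over all 2^n subsets -- exponential in n."""
    for r in range(len(weights) + 1):
        for combo in combinations(weights, r):
            if sum(combo) == target:
                return combo
    return None

# Runtime roughly doubles with every extra element.
for n in (16, 18, 20):
    weights = [random.randint(1, 10**6) for _ in range(n)]
    start = time.perf_counter()
    subset_sum_bruteforce(weights, -1)   # unreachable target forces a full scan
    print(f"n = {n}: {time.perf_counter() - start:.2f} s")
```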

In this regard, quantum computers are considered promising as they can perform a large number of operations in parallel. This, in turn, speeds up the NP problem-solving process. However, many physical implementations of quantum computers are highly sensitive to thermal fluctuations. As a result, quantum computers often demand stringent experimental conditions, such as extremely low temperatures, making their fabrication complicated and expensive.

Fortunately, there is a lesser-known and as-yet underexplored alternative to quantum computing, known as probabilistic computing. Probabilistic computing utilizes what are called “stochastic nanodevices,” whose operations rely on thermal fluctuations, to solve NP problems efficiently. Unlike in the case of quantum computers, thermal fluctuations facilitate problem solving in probabilistic computing. As a result, probabilistic computing is, in fact, easier to implement in real life.
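As a rough sketch of how this works, the snippet below emulates a tiny network of probabilistic bits (p-bits) in software, assuming the standard stochastic-neuron update rule from the p-bit literature; in actual probabilistic hardware, the random numbers would come from thermal fluctuations in stochastic nanodevices rather than from a pseudorandom generator. The network relaxes toward low-energy states of a small Ising-type optimization problem, the form into which many NP problems can be mapped.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny antiferromagnetic Ising problem: J = +1 on the edges of a 5-node ring.
# Goal: minimize E(m) = sum over edges of J_ij * m_i * m_j, with m_i in {-1, +1}.
n = 5
J = np.zeros((n, n))
for i in range(n):
    J[i, (i + 1) % n] = J[(i + 1) % n, i] = 1.0

def energy(m):
    return 0.5 * m @ J @ m               # counts each edge once

m = rng.choice([-1.0, 1.0], size=n)      # random initial state
beta = 2.0                               # inverse "temperature" of the network

# p-bit dynamics: each bit flips stochastically, biased by its local field.
# The thermal noise (here, the rng draw) is the computational resource.
for _ in range(2000):
    i = rng.integers(n)                  # pick a random p-bit
    local_field = -J[i] @ m              # field that favors lower energy
    m[i] = 1.0 if rng.uniform(-1, 1) < np.tanh(beta * local_field) else -1.0

print("final state:", m, " energy:", energy(m))
```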