
New technique could speed up the development of acoustic lenses, impact-resistant films and other futuristic materials

Metamaterials are products of engineering wizardry. They are made from everyday polymers, ceramics, and metals. And when constructed precisely at the microscale, in intricate architectures, these ordinary materials can take on extraordinary properties.

With the help of computer simulations, engineers can play with any combination of microstructures to see how certain materials can transform, for instance, into sound-focusing acoustic lenses or lightweight, bulletproof films.

But simulations can only take a design so far. To know for sure whether a metamaterial will live up to expectations, physical testing is a must. Yet there has been no reliable way to push and pull on metamaterials at the microscale, and to know how they will respond, without contacting and physically damaging the structures in the process.

Laser pulse compression by a density gradient plasma for exawatt to zettawatt lasers

A new method of creating laser pulses, more than 1,000 times as powerful as those currently in existence, has been proposed by scientists in the UK and South Korea.

In joint research, the scientists have demonstrated a new way of compressing light to increase its intensity sufficiently to extract particles from the vacuum and study the nature of matter. To achieve this, the three groups have come together to produce a very special type of mirror, one that not only reflects pulses of light but compresses them in time by a factor of more than two hundred, with further compression possible.

The groups from the University of Strathclyde, UNIST and GIST propose a simple idea: to use the gradient in the density of plasma, which is fully ionized matter, to cause photons to “bunch,” analogous to the way a stretched-out group of cars bunches up as it encounters a steep hill. This could revolutionize the next generation of lasers, enabling their power to increase by more than a million times over what is achievable now.
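The car-on-a-hill analogy can be made concrete with a toy calculation. In plasma, the group velocity of light falls as the electron density rises toward the critical density, v_g = c·sqrt(1 − n_e/n_c), so the front of a pulse climbing a density ramp slows down first and the tail catches up. The sketch below propagates sample points of a pulse through an assumed linear density ramp; the ramp shape and all numbers are illustrative, not parameters from the paper.

```python
import numpy as np

# Toy 1D model of photon "bunching" on a plasma density gradient.
# Units are arbitrary; n_e(x)/n_c is the local electron density
# relative to the critical density, and the group velocity is
# v_g = c * sqrt(1 - n_e/n_c).

c = 1.0

def density_ratio(x):
    """Assumed density ramp: n_e/n_c rises linearly from 0 to 0.9."""
    return np.clip(x / 10.0, 0.0, 0.9)

def group_velocity(x):
    return c * np.sqrt(1.0 - density_ratio(x))

# Represent a "pulse" by sample points spread over one unit of length.
photons = np.linspace(0.0, 1.0, 50)
initial_length = photons.max() - photons.min()

dt = 0.01
for _ in range(1500):
    photons += group_velocity(photons) * dt  # simple Euler step

final_length = photons.max() - photons.min()
print(f"pulse length: {initial_length:.3f} -> {final_length:.3f}")
```

The leading samples enter the dense region earlier and slow down sooner, so the pulse length shrinks by roughly a factor of three in this toy setup; the real scheme achieves far larger compression factors.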

Nanowire Network Mimics Brain, Learns Handwriting with 93.4% Accuracy

Summary: Researchers developed an experimental computing system, resembling a biological brain, that successfully identified handwritten numbers with a 93.4% accuracy rate.

This breakthrough was achieved using a novel training algorithm providing continuous real-time feedback, outperforming traditional batch data processing methods which yielded 91.4% accuracy.
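The distinction between the two training regimes is the update schedule: real-time (online) learning nudges the model after every single example, while batch learning accumulates one averaged update per pass over the whole dataset. Below is a minimal sketch of that contrast using a toy linear classifier on synthetic data; it illustrates only the two update schedules, not the nanowire hardware or the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data (a stand-in for the handwritten digits).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def predict(w, X):
    return (X @ w > 0).astype(float)

# Online learning: weights are updated after every single example,
# mirroring continuous real-time feedback.
w_online = np.zeros(2)
for xi, yi in zip(X, y):
    err = yi - float(xi @ w_online > 0)
    w_online += 0.1 * err * xi

# Batch learning: one averaged update per full pass over the data.
w_batch = np.zeros(2)
for _ in range(20):
    err = y - predict(w_batch, X)
    w_batch += 0.1 * (err @ X) / len(X)

acc_online = (predict(w_online, X) == y).mean()
acc_batch = (predict(w_batch, X) == y).mean()
print(f"online: {acc_online:.2f}  batch: {acc_batch:.2f}")
```

Both schedules converge on this easy problem; the point is that the online learner receives feedback after each sample, which is the property the nanowire system exploits.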

The system’s design features a self-organizing network of nanowires on electrodes, with memory and processing capabilities interwoven, unlike conventional computers with separate modules.

First 2D semiconductor with 1,000 transistors developed: Redefining energy efficiency in data processing

As information and communication technologies (ICT) process data, they convert electricity into heat. Already today, the global ICT ecosystem’s CO2 footprint rivals that of aviation. It turns out, however, that a big part of the energy consumed by computer processors doesn’t go into performing calculations. Instead, the bulk of the energy used to process data is spent shuttling bytes between the memory and the processor.
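A back-of-envelope model shows why data movement dominates. The per-operation energies below are illustrative order-of-magnitude figures often quoted for older CMOS process nodes (a 32-bit add costs a fraction of a picojoule, a DRAM access costs on the order of a nanojoule); they are assumptions for the sketch, not measurements from the paper.

```python
# Assumed order-of-magnitude energy costs, in picojoules.
E_ALU_PJ = 0.1       # one 32-bit integer add
E_DRAM_PJ = 1500.0   # one 32-bit DRAM read

# A loop that streams N values from memory and adds each to an
# accumulator performs one DRAM read and one add per element.
N = 1_000_000
e_compute = N * E_ALU_PJ
e_movement = N * E_DRAM_PJ

share = e_movement / (e_compute + e_movement)
print(f"data movement: {share:.2%} of total energy")
```

Under these assumptions essentially all of the energy goes into moving data rather than computing with it, which is the inefficiency an in-memory processor targets.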

In a paper published in the journal Nature Electronics, researchers from EPFL’s School of Engineering in the Laboratory of Nanoscale Electronics and Structures (LANES) present a new processor that tackles this inefficiency by integrating data processing and storage onto a single device, a so-called in-memory processor.

They broke new ground by creating the first in-memory processor based on a two-dimensional semiconductor to comprise more than 1,000 transistors, a key milestone on the path to industrial production.

Swiss researchers develop first large-scale in-memory processor

Made using molybdenum disulfide, the processor has over 1,000 transistors and works in two dimensions.



Modern-day information technology systems are well known for producing large amounts of heat. Reducing that heat would mean using energy more efficiently, and it would also help cut carbon emissions as the world aims to go greener in the coming decades. To minimize this unwanted heat, one must go to the root of the problem: the von Neumann architecture.

Unlocking the secrets of spin with high-harmonic probes

Deep within every piece of magnetic material, electrons dance to the invisible tune of quantum mechanics. Their spins, akin to tiny atomic tops, dictate the magnetic behavior of the material they inhabit. This microscopic ballet is the cornerstone of magnetic phenomena, and it’s these spins that a team of JILA researchers—headed by JILA Fellows and University of Colorado Boulder professors Margaret Murnane and Henry Kapteyn—has learned to control with remarkable precision, potentially redefining the future of electronics and data storage.

In a Science Advances publication, the JILA team—along with collaborators from universities in Sweden, Greece, and Germany—probed the spin dynamics within a special material known as a Heusler compound: a mixture of metals that behaves like a single magnetic material.

For this study, the researchers utilized a compound of cobalt, manganese, and gallium, which behaved as a conductor for electrons whose spins were aligned upwards and as an insulator for electrons whose spins were aligned downwards.
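This spin-selective behavior is the hallmark of a half-metal: in a simple two-channel picture, spin-up electrons see states at the Fermi level (a metal) while spin-down electrons see an energy gap (an insulator). The sketch below encodes that picture with crude, assumed band parameters; it is an illustration of the concept, not the compound's actual band structure.

```python
# Toy two-channel picture of a half-metallic compound: spin-up
# electrons see a metal, spin-down electrons see a gapped insulator.
# The gap size and constant density of states are assumptions.

FERMI_ENERGY = 0.0  # eV, reference energy

def density_of_states(energy, spin):
    """Crude DOS: constant for spin-up, gapped around E_F for spin-down."""
    if spin == "up":
        return 1.0
    gap = 1.0  # eV, assumed spin-down insulating gap
    return 0.0 if abs(energy - FERMI_ENERGY) < gap / 2 else 1.0

# Conduction requires states at the Fermi level: only spin-up has them.
print("spin-up conducts:", density_of_states(FERMI_ENERGY, "up") > 0)
print("spin-down conducts:", density_of_states(FERMI_ENERGY, "down") > 0)
```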

Adaptively partitioned analog quantum simulation on near-term quantum computers: The nonclassical free-induction decay of NV centers in diamond

The idea of simulating quantum physics with controllable quantum devices was proposed several decades ago. With the extensive development of quantum technology, large-scale simulation, such as analog quantum simulation that tailors an artificial Hamiltonian to mimic the system of interest, has been implemented on elaborate quantum experimental platforms. However, due to limitations caused by significant noise and restricted connectivity, analog simulation is generically infeasible on near-term quantum computing platforms. Here we propose an alternative analog simulation approach for near-term quantum devices. Our approach circumvents these limitations by adaptively partitioning the bath into several groups based on the performance of the quantum devices.
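The partitioning idea can be sketched with a simple heuristic: split the bath spins into groups small enough for a noisy device, capping each group's total coupling at some budget set by device performance. The greedy grouping rule and the random couplings below are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
couplings = rng.exponential(scale=1.0, size=20)  # assumed bath couplings

def partition(couplings, budget):
    """Greedy first-fit-decreasing split of the bath spins into groups
    whose summed coupling stays within the device's budget."""
    groups = []
    for i in np.argsort(couplings)[::-1]:       # strongest spins first
        for g in groups:
            if sum(couplings[j] for j in g) + couplings[i] <= budget:
                g.append(i)
                break
        else:
            groups.append([i])                  # open a new group
    return groups

groups = partition(couplings, budget=3.0)
print(f"{len(couplings)} bath spins -> {len(groups)} groups")
```

Each group is then small enough to be simulated on the device separately, with the partition adapted to whatever budget the hardware can support.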

Quantum computer emulated by a classical system

Year 2015


(Phys.org)—Quantum computers are inherently different from their classical counterparts because they involve quantum phenomena, such as superposition and entanglement, which do not exist in classical digital computers. But in a new paper, physicists have shown that a classical analog computer can be used to emulate a quantum computer, along with quantum superposition and entanglement, with the result that the fully classical system behaves like a true quantum computer.

Physicist Brian La Cour and electrical engineer Granville Ott at Applied Research Laboratories, The University of Texas at Austin (ARL:UT), have published a paper on the classical emulation of a quantum computer in a recent issue of the New Journal of Physics. Besides having fundamental interest, using classical systems to emulate quantum computers could have practical advantages, since such quantum emulation devices would be easier to build and more robust to decoherence compared with true quantum computers.
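The core of any classical emulation is that a quantum state can be carried around explicitly as a vector of complex amplitudes, with gates acting as matrices on that vector. The sketch below builds the entangled Bell state this way; it is a minimal digital state-vector illustration in the spirit of the work, not La Cour and Ott's actual analog-signal scheme.

```python
import numpy as np

# Standard gate matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Two qubits in |00>: amplitudes over the basis |00>, |01>, |10>, |11>.
state = np.zeros(4)
state[0] = 1.0

# Hadamard on the first qubit creates a superposition, then CNOT
# entangles the two qubits.
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# Result: the Bell state (|00> + |11>) / sqrt(2).
print(np.round(state, 3))
```

The amplitudes come out as [0.707, 0, 0, 0.707]: the classical vector exhibits both superposition and entanglement, exactly the behavior the analog emulator reproduces with physical signals.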

“We hope that this work removes some of the mystery and ‘weirdness’ associated with quantum computing by providing a concrete, classical analog,” La Cour told Phys.org. “The insights gained should help develop exciting new technology in both classical analog computing and true quantum computing.”