
Data centers are facilities that house the computing hardware used to process and store data. While some businesses maintain their own data centers on site, many others rely on ones owned and operated by someone else.

As our digital world continues to grow, demand for data centers — and clean electricity to operate them — is also increasing. To find out how we’ll be able to keep up, let’s look at the history of data centers, the challenges facing them, and ideas for overcoming those issues — on land, at sea, and in space.

Researchers at the University of Chicago Pritzker School of Molecular Engineering (PME) have made unexpected progress toward developing a new optical memory that can quickly and energy-efficiently store and access computational data. While studying a complex material composed of manganese, bismuth and tellurium (MnBi2Te4), the researchers realized that the material’s magnetic properties changed quickly and easily in response to light. This means that a laser could be used to encode information within the magnetic states of MnBi2Te4.

PEARC24 launched its first Workshop on Broadly Accessible Quantum Computing (QC) as the full conference began on July 22 in Providence, RI. Led by NCSA’s Bruno Abreu and QuEra’s Tommaso Macri, the workshop drew more than 30 participants, including quantum chemists, system administrators, software developers, research computing facilitators, students, and others looking to better understand the current status and prospects of QC and its applications.

Join Brian Greene and a team of researchers testing Google’s quantum computer to glean new insights about quantum gravity from their impressive, if controversial, results.

Participants:
Maria Spiropulu.
Joseph Lykken.
Daniel Jafferis.

Moderator:
Brian Greene.

00:00 — Introduction.

Single-photon detectors built from superconducting nanowires have become a vital tool for quantum information processing, while their superior speed and sensitivity have made them an appealing option for low-light imaging applications such as space exploration and biophotonics. However, it has proved difficult to build high-resolution cameras from these devices because the cryogenically cooled detectors must be connected to readout electronics operating at room temperature. Now a research team led by Karl Berggren at the Massachusetts Institute of Technology has demonstrated a superconducting electronics platform that can process the single-photon signals at ultracold temperatures, providing a scalable pathway for building megapixel imaging arrays [1].

The key problem with designing high-resolution cameras based on these superconducting detectors is that each of the sensors requires a dedicated readout wire to record the single-photon signals, which adds complexity and heat load to the cryogenic system. Researchers have explored various multiplexing techniques to reduce the number of connections to individual detectors, yielding imaging arrays in the kilopixel range, but further scaling will likely require a signal-processing solution that can operate at ultralow temperatures.
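To make that scaling argument concrete, here is a rough back-of-the-envelope sketch (illustrative only, not from the paper) comparing the number of cryogenic readout lines needed if every detector gets a dedicated wire versus a simple row-column multiplexing scheme, one of the approaches researchers have explored. The square-array geometry and the specific pixel counts are assumptions for illustration.

```python
# Illustrative comparison (not from the paper): readout lines that must
# cross the cryostat boundary for a dedicated-wire scheme versus simple
# row-column multiplexing of a square detector array.

import math

def dedicated_wires(n_pixels: int) -> int:
    """Dedicated readout: one line per detector."""
    return n_pixels

def row_column_wires(n_pixels: int) -> int:
    """Row-column multiplexing: one line per row plus one per column.

    Assumes a perfectly square array (an illustrative simplification).
    """
    side = math.isqrt(n_pixels)
    return 2 * side

for n_pixels in (1_024, 65_536, 1_048_576):  # kilopixel- to megapixel-class arrays
    print(f"{n_pixels:>9} pixels: "
          f"{dedicated_wires(n_pixels):>9} dedicated lines vs "
          f"{row_column_wires(n_pixels):>5} row-column lines")
```

Even in the multiplexed case, a megapixel-class array would still need thousands of lines running to room temperature, which illustrates why further scaling likely requires processing the signals at ultralow temperatures, as the MIT team proposes.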

Berggren and his collaborators believe that the answer lies in devices called nanocryotrons (nTrons), which are three-terminal structures made from superconducting nanowires, just like the single-photon detectors themselves. Although nTrons do not deliver the same speed and power as superconducting electronics based on Josephson junctions, the researchers argue that these shortcomings are not a critical problem in photon-sensing applications, where the detectors are similarly limited in speed and power. The nTrons also offer several advantages over Josephson junctions: they operate over a wider range of cryogenic temperatures, they don’t require magnetic shielding, and they use the same fabrication process as the detectors, allowing for easy on-chip integration.

Researchers at TMOS, the ARC Centre of Excellence for Transformative Meta-Optical Systems, and their collaborators at RMIT University have developed a 2D quantum sensing chip based on hexagonal boron nitride (hBN) that can simultaneously detect temperature anomalies and magnetic fields in any direction, in a groundbreaking thin-film format.

Warning: Spoilers ahead for both films, The Matrix and The Thirteenth Floor!

This year marks the 25th anniversary of both The Matrix (see my article here) and The Thirteenth Floor (released on May 28, 1999). Both films depict what we now call the simulation hypothesis, the idea that we might live inside a computer simulation. In the college-level class I teach on the emerging field of simulation (titled Simulation Theory: Sci-Fi, Technology, Religion and Philosophy), The Matrix is the first sci-fi film I assign to my students, and The Thirteenth Floor is the second. While The Matrix is by far the most recognizable popular depiction of the simulation idea, on this anniversary I am arguing that The Thirteenth Floor may be a better and richer representation of several aspects of the simulation hypothesis than even The Matrix.