Ellie Gabel discusses how computing at the thinnest scale is driving breakthroughs in quantum technology, ultrathin chips, and atomic-layer devices that could transform the future of computing.

Computing has progressed exponentially in just a few decades. Even as internal components have shrunk, computational power has increased by orders of magnitude.

Still, the world’s most powerful computers remain restricted to bulky form factors and complex optical setups.

New technologies present novel problems. Quantum computers, for instance, are sensitive to even the most minor perturbations. Condensing components could pay off, but the approach poses numerous technical challenges.

What if there were a way to scale electronics down and use far fewer parts without impacting performance?

Researchers have finally answered this burning question and found that the finish line for the race toward atomic-scale computing is in sight.

Once thought impossible, sub-nanoscale chips are in development. If scalable, this technology could revolutionise computing.

Ultrathin chip technology sees a breakthrough

Computing at the thinnest possible scale is about speed, efficiency and performance, not hubris.

Supercomputers are incredibly powerful, but are reminiscent of the bulky personal computers of old — they require highly controlled, difficult-to-scale conditions.

Practical quantum computers are crucial for next-generation computing, and miniaturisation is the key to unlocking them.

Typically, photons are coaxed into quantum states by intricate optical devices like waveguides. Entanglement enables them to encode and process data in parallel.

This interaction is notoriously challenging to scale because any imperfection can degrade computation.

Optics researchers at the Harvard School of Engineering and Applied Sciences made a significant leap toward room-temperature quantum computing by leveraging nanoscale technology.

They developed a novel metasurface — a two-dimensional device etched with nanoscale patterns to control the behaviour of electromagnetic waves.

Replacing the conventional setup with one ultrathin chip eliminates the need for complex, bulky optical components.

Their miniature, error-resistant quantum metasurface can generate entangled photons, making quantum networks more reliable and scalable. This solution is cost-effective, easy to fabricate and does not require intricate alignments.

Semiconductor manufacturing’s miniaturisation trend signals that such breakthroughs will not remain restricted to academia for long.

With feature sizes reaching under 5 nanometers – and subnanometer solutions on the horizon – precision manufacturing will advance rapidly.

The science behind atomic-scale computing

The race toward atomic-scale computing has to start somewhere. Heat dissipation is among the most pressing challenges facing research and development teams. As components shrink, the heat they generate is concentrated in an ever-smaller area and resistive losses climb, making dissipation increasingly difficult.

At nanoscale thickness, a copper wire’s electrical resistance increases rapidly because the electrons are more likely to collide with the wire’s surface, producing more waste heat.
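The surface-scattering effect described above can be sketched with a simplified thin-film resistivity model. The snippet below uses the thick-film limit of the Fuchs–Sondheimer approximation as an illustration only — the coefficient, the ~39-nanometer mean free path for copper and the specularity parameter are textbook assumptions, not figures from the research discussed here.

```python
def thin_film_resistivity_ratio(thickness_nm, mfp_nm=39.0, specularity=0.0):
    """Approximate resistivity of a thin metal film relative to bulk.

    Thick-film limit of the Fuchs-Sondheimer surface-scattering model:
        rho_film / rho_bulk ~ 1 + (3/8) * (1 - p) * (mfp / thickness)
    mfp_nm: electron mean free path (~39 nm for copper at room temperature).
    specularity p: fraction of electrons reflected specularly at the surface
    (0 = fully diffuse scattering, the worst case for resistance).
    """
    return 1.0 + 0.375 * (1.0 - specularity) * (mfp_nm / thickness_nm)

for t in (100, 50, 10, 5):
    ratio = thin_film_resistivity_ratio(t)
    print(f"{t:>4} nm copper film: ~{ratio:.2f}x bulk resistivity")
```

Even this rough model shows why thin copper wiring runs hot: at 5 nanometers the estimated resistivity is several times the bulk value, so the same current produces several times the waste heat.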

Increasing power capacity to offset performance losses is out of the question – miniaturisation is the whole point. This issue limits the size and efficiency of nanoscale computing technology.

Stanford Engineering researchers developed an innovative solution to this problem – niobium phosphide. This ultrathin material conducts electricity better than copper in films that are only a few atoms thick.

While copper becomes a worse conductor at thicknesses below roughly 50 nanometers, niobium phosphide performs well at 5 nanometers, even at room temperature.

Two-dimensional materials are foundational for computing at the thinnest scale. Another research team discovered that atomic-layer devices made of tungsten diselenide (WSe2) have an exceptionally strong nonlinear optical response.

These devices operate with only a few thousand photons, making them far more efficient than conventional fibre-optic signalling for long-distance communication.

Fibre-optic networks are fast, but electrical processing generates excessive waste heat and introduces delays. WSe2 uses a small number of photons to process information, improving telecommunication efficiency.

Original equipment manufacturers could apply this breakthrough to quantum computing.

The current state of research and development

Notable research and development milestones continue to accumulate, with prototypes and discoveries steadily emerging from industry and academia.

Many are focused on quantum computing. The application may be niche, but findings will trickle down, catalysing progress.

Take one recent quantum dots breakthrough, for example. Lawrence Livermore National Laboratory researchers pioneered a novel technique for depositing quantum dots on corrugated surfaces with liquid engineering.

This innovative approach eliminates the need for post-processing, significantly improving device scalability and performance.

Near-infrared photodetectors are fundamental to sensing technologies. Performance takes priority, but a compact form factor is nonnegotiable, especially in cutting-edge defence, biomedical and security systems.

Imaging systems must detect multiple wavelengths of light simultaneously on a single chip. However, depositing quantum dots on a textured surface is difficult.

This novel application technique presents a cost-effective, scalable alternative that could revolutionise the production of medical equipment, communication systems and consumer electronics.

The path toward computing at the thinnest scale

Original equipment manufacturers have yet to apply recent breakthroughs at scale, but they are already looking forward.

They are right to – this industry moves fast. Once they achieve subnanoscale production, will they move on to refining atomic-layer devices?

What’s next for computing at the thinnest scale?

Assessing the current state of semiconductor and electronics manufacturing will present a clearer picture of the industry’s future outlook.

The United States controls just 12% of global semiconductor manufacturing capacity. Congress passed the CHIPS Act to incentivise reshoring, but manufacturers remain limited by the scarcity of domestic rare-earth element deposits.

As of 2025, China leads the world in electronics miniaturisation. Already, Chinese researchers are utilising molecular beam epitaxy to circumvent the conventional limitations of crystal growth.

This approach provides unparalleled structural control, ensuring perfect alignment and considerably reducing manufacturing defects.

Theoretically, China could use this method to produce up to 50 layers per minute, with a maximum of 15,000 semiconductor layers.
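A quick back-of-the-envelope calculation, taking the reported figures at face value, shows what that throughput would mean in practice:

```python
layers_per_minute = 50   # reported theoretical deposition rate
max_layers = 15_000      # reported maximum semiconductor stack depth

total_minutes = max_layers / layers_per_minute
print(f"Full {max_layers}-layer stack: {total_minutes:.0f} minutes "
      f"(~{total_minutes / 60:.0f} hours)")
```

At that rate, a maximally deep stack would take roughly five hours to deposit – fast enough to make mass production of multilayer devices plausible rather than a laboratory curiosity.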

At just a few atoms thick, the ultrathin chips would revolutionise computing. Being the first to market with an efficient mass production method could permanently tip the scales.

Innovation is beneficial regardless of where it occurs. However, today’s actions will shape tomorrow’s technology landscape, influencing supply chains and market competition.

Policymakers should pay attention to material synthesis and device engineering breakthroughs.

Applying pressure to scientists and policymakers

Even though novel manufacturing techniques are largely proof of concept, commercialisation pathways exist.

As feature sizes approach the subnanometer scale, engineers must continue exploring ways to enable sophisticated computing operations. The more efficient their designs, the more energy they can spend on actual computation.

The finish line for the mass production of next-generation atomic-layer devices is fast approaching. Whoever reaches it first will dominate the market.

At this stage, cross-border collaboration between scientists, industry professionals and policymakers is crucial.