Applications
ORBIT's probabilistic architecture addresses three fundamental problem classes across multiple industries.
Combinatorial Optimisation
Quantum Dice's probabilistic implementation tackles multiple classes of NP-hard problems, finding near-optimal solutions faster than conventional deterministic hardware across the following domains:
Probabilistic computing offers new ways to optimise routing, scheduling, and supply chain networks. It helps optimise delivery routes, fleet management, and scheduling tasks where demand, travel times, and customer requests are uncertain; for example, it can reduce fuel costs and emissions by streamlining delivery routes.
It can also efficiently tackle complex vehicle routing problems with multiple objectives such as cost, time, and carbon footprint reduction.
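As a concrete illustration of the stochastic-search idea behind these routing applications, the sketch below runs simulated annealing on a toy travelling-salesman instance in plain Python. It is not ORBIT's hardware algorithm; the function names, cooling schedule, and parameters are illustrative assumptions.

```python
import math
import random

def tour_length(tour, dist):
    """Total length of a closed tour over a distance matrix."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def anneal_tsp(dist, steps=20000, t0=2.0, t1=0.01, seed=0):
    """Simulated annealing on a small travelling-salesman instance.

    Proposes random two-city swaps and accepts uphill moves with a
    temperature-dependent probability, so the search can escape local optima.
    """
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)
    cur_len = tour_length(tour, dist)
    best, best_len = tour[:], cur_len
    for k in range(steps):
        t = t0 * (t1 / t0) ** (k / steps)  # geometric cooling schedule
        i, j = rng.sample(range(n), 2)
        cand = tour[:]
        cand[i], cand[j] = cand[j], cand[i]
        cand_len = tour_length(cand, dist)
        # Metropolis criterion: always accept improvements, sometimes accept worse
        if cand_len <= cur_len or rng.random() < math.exp((cur_len - cand_len) / t):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
    return best, best_len
```

On a few dozen cities this already finds near-optimal tours in a fraction of the time an exact solver would need; the hardware version replaces the software random-number draws with physical stochasticity.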

Probabilistic computing is also well suited to the real-time network optimisation problems encountered in telecommunication and utility networks.
It is particularly effective for data-heavy problems such as real-time radio resource allocation in next-generation mobile networks, or balancing the various sources of electricity production in modern grids during periods of strongly fluctuating demand and production.

Probabilistic computing is also well suited to electronic design automation: its inherent stochasticity lets it explore vast design spaces efficiently, identifying near-optimal circuit configurations and layouts faster.
This reduces the computational complexity of circuit optimisation tasks such as placement, routing, and timing analysis, improving design quality and shortening development time.
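The same accept-worse-moves-with-some-probability pattern applies to placement. Below is a minimal, purely illustrative sketch that anneals a tiny cell placement to reduce total Manhattan wirelength; the move set, cost model, and parameters are assumptions, not ORBIT's EDA flow.

```python
import math
import random

def wirelength(pos, nets):
    """Total Manhattan wirelength over two-pin nets (pos[c] = (x, y) slot of cell c)."""
    return sum(abs(pos[a][0] - pos[b][0]) + abs(pos[a][1] - pos[b][1])
               for a, b in nets)

def anneal_placement(slots, nets, steps=5000, t0=3.0, t1=0.05, seed=1):
    """Stochastic cell placement: swap two cells' grid slots and accept the
    swap with a Metropolis probability, cooling the temperature geometrically."""
    rng = random.Random(seed)
    pos = slots[:]                 # one cell per slot, initially random
    rng.shuffle(pos)
    cost = wirelength(pos, nets)
    best, best_cost = pos[:], cost
    for k in range(steps):
        t = t0 * (t1 / t0) ** (k / steps)
        i, j = rng.sample(range(len(pos)), 2)
        pos[i], pos[j] = pos[j], pos[i]
        new_cost = wirelength(pos, nets)
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
            if cost < best_cost:
                best, best_cost = pos[:], cost
        else:
            pos[i], pos[j] = pos[j], pos[i]  # undo rejected swap
    return best, best_cost
```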

AI / Machine Learning
Energy-based learning optimises parameters to reduce the free energy of data; dense interactions allow richer energy surfaces and better modelling of complex distributions, aligning with classic energy-based model (EBM) principles.
Our architecture can emulate fully connected Boltzmann machines at scale, which enhances expressive power and sampling efficiency compared to sparse or physically limited topologies in conventional implementations.
Our approach replaces large physical synapse networks or hidden chains with a digital energy calculator. This cuts the interconnect overhead and routing complexity that form a major bottleneck in dense EBMs, improving hardware utilisation and enabling larger problem instances under fixed physical constraints.
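To make the idea concrete, here is a minimal software sketch of Gibbs sampling over a fully connected Boltzmann machine in which each unit's local field is computed digitally rather than through a physical all-to-all interconnect. It illustrates the principle only; the ±1 spin convention, energy function, and parameters are assumptions, not ORBIT's implementation.

```python
import math
import random

def gibbs_sample(J, h, sweeps=200, beta=1.0, seed=0):
    """Gibbs sampling of a fully connected Boltzmann machine with +/-1 units.

    The local field h_i + sum_j J[i][j] * s[j] is computed digitally for every
    unit, so arbitrarily dense couplings J are supported without a physical
    all-to-all wiring fabric.
    """
    rng = random.Random(seed)
    n = len(h)
    s = [rng.choice([-1, 1]) for _ in range(n)]
    for _ in range(sweeps):
        for i in range(n):
            field = h[i] + sum(J[i][j] * s[j] for j in range(n) if j != i)
            p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * field))  # P(s_i = +1)
            s[i] = 1 if rng.random() < p_up else -1
    return s

def energy(J, h, s):
    """E(s) = -sum_{i<j} J_ij s_i s_j - sum_i h_i s_i."""
    n = len(s)
    e = -sum(h[i] * s[i] for i in range(n))
    e -= sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))
    return e
```

With ferromagnetic couplings and positive biases, repeated sweeps drive the state toward low-energy configurations, which is exactly the sampling behaviour that energy-based learning exploits.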

Simulation & Cryptanalysis
Probabilistic computing offers a paradigm shift for large-scale simulations by moving away from the rigid, energy-intensive precision of traditional binary logic. In a standard simulation, every bit must be deterministically computed, requiring significant power and time to resolve complex stochastic variables.
Probabilistic processors (such as those built from p-bits) are inherently stochastic and can naturally model real-world systems, allowing them to traverse vast solution spaces, such as those in fluid dynamics or molecular modelling, much faster than deterministic hardware.
By embracing a degree of inherent uncertainty at the hardware level, these systems can identify high-probability outcomes and global minima without getting stuck in local optima. This sharply reduces the computational overhead of Monte Carlo methods and uncertainty quantification, effectively trading perfect bit-level accuracy for a rapid, statistically correct picture of highly complex phenomena.

In cryptanalysis, probabilistic computing acts as a specialized hardware accelerator for the algorithms used to break codes. These algorithms often require solving many sub-problems to reconstruct a hidden key.
While traditional computers struggle to sift through the sheer number of possibilities these problems present, probabilistic systems excel at navigating such “needle-in-a-haystack” searches.
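As a toy stand-in for such a search, the sketch below uses stochastic bit-flip local search with a Metropolis acceptance rule to solve a small subset-sum instance, the kind of combinatorial subproblem that appears in key reconstruction. The problem instance, parameters, and move set are illustrative assumptions, not a real cryptanalytic workload.

```python
import math
import random

def stochastic_subset_sum(weights, target, steps=50000, beta=0.5, seed=0):
    """Stochastic local search for subset sum.

    Flips one inclusion bit at a time; the Metropolis rule accepts worsening
    flips with probability exp(-beta * increase), letting the search move
    through the exponentially large candidate space instead of enumerating it.
    """
    rng = random.Random(seed)
    n = len(weights)
    bits = [rng.randint(0, 1) for _ in range(n)]
    total = sum(w for w, b in zip(weights, bits) if b)
    cost = abs(total - target)
    for _ in range(steps):
        if cost == 0:
            break                        # exact hit: the "needle" is found
        i = rng.randrange(n)
        delta = weights[i] if bits[i] == 0 else -weights[i]
        new_cost = abs(total + delta - target)
        if new_cost <= cost or rng.random() < math.exp(-beta * (new_cost - cost)):
            bits[i] ^= 1
            total += delta
            cost = new_cost
    return bits, cost
```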

Next steps