Superpositions Studio

    HQNN
    Hybrid Quantum Neural Network

    Combine quantum layers with classical deep learning to enhance feature expressivity — transparent, reproducible, and benchmarked on Superpositions Studio.

    Reproducible hybrid pipelines
    Downloadable code & benchmarks
    Classical baselines included
    Transparent metrics & results

    Overview

    A Hybrid Quantum Neural Network (HQNN) integrates parameterized quantum circuits as layers within a classical neural network. Classical components (e.g., CNN/MLP) handle robust feature extraction, while quantum layers provide additional feature transformations that may be hard to emulate classically.
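    As a toy sketch of this layering (pure Python; the single-qubit circuit, its closed-form expectation, and all names here are illustrative simplifications, not any framework's API):

```python
import math

def quantum_layer(x, theta):
    """Toy one-qubit 'quantum layer': angle-encode x with RY(x), then apply a
    trainable RY(theta). The Z expectation on |0> is then cos(x + theta)."""
    return math.cos(x + theta)

def hybrid_forward(x, w_in, theta, w_out):
    h = w_in * x                 # classical feature extraction (stand-in for a CNN/MLP)
    q = quantum_layer(h, theta)  # quantum feature transform
    return w_out * q             # classical readout head
```

In a real pipeline the scalar multiplications become classical network modules and the cosine becomes a measured expectation value, but the composition pattern is the same.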

    Why It Matters

    Hybrid models exploit mature classical training infrastructure while exploring potential quantum advantages in representation learning, especially for small and structured datasets.

    How HQNN Works

    A four-step process to build hybrid quantum-classical neural networks

    01

    Build a classical backbone

    Build a classical backbone (e.g., CNN or MLP) for initial features.

    02

    Insert a quantum layer

    Insert a quantum layer (PQC) that transforms features via a feature map and trainable gates.

    03

    Optimize end-to-end

    Optimize end-to-end with stochastic optimizers; gradients for the quantum layer come from the parameter-shift rule or gradient-free estimators.

    04

    Evaluate and ablate

    Evaluate calibration and robustness, and ablate the quantum layer to measure its contribution.
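    The four steps above can be sketched on a one-parameter toy model (pure Python; `expval` stands in for a real PQC measurement, and the loss, target value, and learning rate are illustrative assumptions):

```python
import math

def expval(theta):
    """Toy quantum layer: <Z> after RY(theta) on |0> is cos(theta)."""
    return math.cos(theta)

def param_shift_grad(f, theta):
    """Parameter-shift rule: exact PQC gradient from two shifted evaluations."""
    s = math.pi / 2
    return (f(theta + s) - f(theta - s)) / 2

# End-to-end optimization (step 03): drive <Z> toward a target of -1
# by gradient descent on the squared error (expval(theta) + 1)^2.
theta, lr = 2.0, 0.5
for _ in range(100):
    loss_grad = 2 * (expval(theta) + 1) * param_shift_grad(expval, theta)
    theta -= lr * loss_grad
```

For this circuit the shift identity (f(θ+π/2) − f(θ−π/2))/2 equals −sin θ, the analytic derivative of cos θ, which is what lets backpropagation treat the quantum layer like any other differentiable module.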

    Real-World Applications

    Where hybrid quantum-classical models may provide tangible advantages

    Efficiency

    Small/medium datasets

    Small/medium datasets with nonlinear decision boundaries

    Advanced

    Signals & anomalies

    Signals & anomalies (spectra, time series, scientific data)

    Innovation

    Research on quantum-enhanced features

    Research on quantum-enhanced feature maps/kernels

    Research

    Hybrid pipelines

    Build end-to-end hybrid pipelines that blend PyTorch modules and quantum layers

    Strengths & Limitations

    Strengths

    • Leverages classical strengths while exploring quantum features
    • Modular and extensible in standard ML frameworks
    • End-to-end training and ablations

    Limitations

    • No guaranteed advantage; benefits are task-dependent
    • Training noise and shot cost can be high
    • Careful engineering needed to avoid barren plateaus

    Benchmarking and Verification

    Benchmark against classical backbones without quantum layers; report deltas in accuracy, calibration, robustness, and compute cost. Seed-controlled pipelines and saved checkpoints ensure reproducibility.
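    A minimal illustration of seed control (pure Python; the "experiment" here is a stand-in, not the Studio's actual pipeline):

```python
import random

def run_experiment(seed):
    """Toy seed-controlled pipeline: fixing the seed fixes every random
    choice, so reruns reproduce the reported metric exactly."""
    rng = random.Random(seed)                          # one seeded RNG per run
    data = [rng.gauss(0.0, 1.0) for _ in range(100)]   # stand-in dataset
    init = rng.uniform(-1.0, 1.0)                      # stand-in parameter init
    return init + sum(data) / len(data)                # stand-in metric
```

The same discipline applies to real runs: seed every RNG (data shuffling, initialization, shot sampling) from one recorded value, and save checkpoints alongside it.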

    Hardware & Requirements

    Qubits: dimension of the quantum layer's input
    Depth: shallow to moderate; deeper circuits increase noise and training difficulty
    Shots: 1k–50k per circuit run, depending on budget
    Backend: prototype on a simulator, then run on a quantum device
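    The shot budget trades statistical accuracy for runtime: the standard error of an expectation estimate shrinks as 1/√shots. A minimal pure-Python sketch (the single-qubit RY circuit and the seeding are illustrative assumptions):

```python
import math
import random

def estimate_expectation(theta, shots, seed=0):
    """Estimate <Z> after RY(theta)|0> from a finite shot budget.
    Each shot yields outcome +1 with probability cos^2(theta / 2)."""
    rng = random.Random(seed)
    p0 = math.cos(theta / 2) ** 2
    n0 = sum(rng.random() < p0 for _ in range(shots))  # count of +1 outcomes
    return 2 * n0 / shots - 1                          # empirical <Z>
```

With cos(π/3) = 0.5 as the exact value, 5,000 shots give roughly ±0.012 standard error, which is why the 1k–50k range above is a tunable budget rather than a fixed cost.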

    Proof-of-Concept Example

    Real experimental results demonstrating HQNN performance

    Task

    Time-series prediction of wind energy output

    Optimizer

    Adam, with parameter-shift gradient estimation for the quantum layer

    Metrics: accuracy of the predictions
    Outcome: stable training on a GPU-backed simulator; a quantum-device demo is feasible

    FAQ

    Common questions about HQNN implementation and performance

    Ready to Run HQNN?

    Run HQNN on Superpositions Studio — plug quantum layers into your DL models and export reproducible experiments.

    Powered by Superpositions Studio — Transparent, reproducible quantum computing