The Potential of Quantum Processing Units in Neural Network Construction

1. Introduction: The Convergence of Quantum Computing and Neural Networks

The field of artificial intelligence has witnessed remarkable advancements in recent years, largely driven by the success of classical machine learning, particularly deep learning. These techniques have found widespread applications across diverse domains, from image recognition and natural language processing to complex decision-making systems.1 Simultaneously, quantum computing has emerged as a revolutionary paradigm, leveraging the principles of quantum mechanics to perform computations that are intractable for even the most powerful classical supercomputers.12 This innovative approach hinges on phenomena such as superposition and entanglement, offering the potential for exponential speedups for specific types of computational problems.15 The intersection of these two transformative fields, known as quantum machine learning (QML), seeks to harness the unique capabilities of quantum computing to enhance and accelerate machine learning algorithms, including the construction and training of neural networks. 24 The increasing complexity of machine learning tasks, particularly those involving massive datasets and intricate models, often pushes the boundaries of classical computational resources. Quantum computing offers a potential pathway to overcome these limitations for certain classes of problems, leading to the growing interest in exploring the synergy between these two disciplines. 27

This report aims to explore the potential role of quantum processing units (QPUs), the core computational component of quantum computers, in the construction of neural networks. It will delve into the fundamental principles that underpin both QPUs and neural networks, providing a comprehensive understanding of their individual capabilities.1 Subsequently, the report will analyse the various ways in which QPUs can be employed in the context of neural network construction, encompassing aspects such as network architecture design, training methodologies, and potential applications.25 The current state of research in this rapidly evolving field will be discussed, along with an examination of the potential advantages that QPUs could bring to neural networks, as well as the existing challenges and limitations that need to be addressed.24 By providing a holistic view of the intricate relationship between QPUs and neural networks, this report seeks to move beyond general discussions and explore the underlying mechanisms and implications of this exciting interdisciplinary area.

2. Quantum Processing Units (QPUs): A Primer

A quantum processing unit (QPU) serves as the central computational component within a quantum computer, functioning similarly to the central processing unit (CPU) in classical computing systems.13 However, unlike CPUs that manipulate bits representing either 0 or 1, QPUs harness the principles of quantum mechanics to process information using quantum bits, or qubits.12 A key distinction lies in the ability of qubits to exist in a state of superposition, meaning they can represent a combination of both 0 and 1 simultaneously.13 This is analogous to a coin spinning in the air, being neither heads nor tails until it lands. Furthermore, QPUs leverage the phenomenon of entanglement, where two or more qubits become interconnected in such a way that their quantum states are correlated, regardless of the physical distance separating them.13 This interconnectedness allows for instantaneous correlations between the entangled qubits. By exploiting superposition and entanglement, QPUs can perform complex calculations by simultaneously exploring multiple possibilities, a feat that is fundamentally different from the sequential processing of classical computers.15
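
The effect of superposition and entanglement can be made concrete with a small classical simulation. The following sketch (pure Python, not a real QPU interface) represents two qubits as a four-amplitude state vector, applies a Hadamard gate to the first qubit and then a controlled-NOT, and prints the measurement probabilities of the resulting Bell state:

```python
import math

# Two-qubit state vector over basis |00>, |01>, |10>, |11>; start in |00>.
state = [1.0, 0.0, 0.0, 0.0]

def hadamard_on_first(s):
    """Apply a Hadamard gate to the first qubit, creating superposition.
    It mixes the amplitude pairs that differ in the first qubit: (0,2) and (1,3)."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]),
            h * (s[1] + s[3]),
            h * (s[0] - s[2]),
            h * (s[1] - s[3])]

def cnot(s):
    """Controlled-NOT: flip the second qubit when the first is 1
    (swap the amplitudes of |10> and |11>)."""
    return [s[0], s[1], s[3], s[2]]

state = cnot(hadamard_on_first(state))
probs = [abs(a) ** 2 for a in state]
# The Bell state (|00> + |11>)/sqrt(2): only 00 and 11 are ever measured,
# and the two outcomes are perfectly correlated (entanglement).
print(probs)  # approximately [0.5, 0.0, 0.0, 0.5]
```

A classical pair of bits in this simulation would need an explicit joint probability table; the entangled state encodes the correlation directly in its amplitudes.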

While both QPUs and classical computing units like CPUs and graphics processing units (GPUs) are processing hardware, they operate on distinct principles. Central processing units rely on classical physics and binary bits, whereas QPUs utilize the principles of quantum mechanics.16 GPUs, while also capable of parallel processing, are based on classical physics and electrical currents, unlike QPUs which manipulate the quantum states of particles.16 It is important to note that QPUs are not intended as a direct replacement for CPUs or GPUs. Instead, they are designed to excel at specific types of computationally intensive problems that are particularly well-suited to quantum algorithms.13 In fact, for many everyday computational tasks, CPUs operate much faster than current QPUs.14 However, for certain complex problems, QPUs offer the potential for significantly greater efficiency, potentially leading to faster computation times overall. 14 Interestingly, given the current limitations of quantum hardware, GPUs are sometimes used in classical systems to simulate the behaviour of QPUs, aiding in the development and testing of quantum algorithms.16 This highlights the fundamentally different nature of quantum computation compared to classical approaches.

The realization of QPUs involves various underlying qubit technologies, each with its own unique characteristics and technological requirements. 14 Superconducting qubits, employed by companies like IBM and Google, are fabricated using tiny superconducting circuits that are cooled to temperatures near absolute zero to exhibit quantum properties.13 Trapped ion qubits, utilized by IonQ and Honeywell, involve using electromagnetic fields to hold individual ions in place, with their quantum states controlled by lasers.14 Photonic qubits, explored by companies like Xanadu, use light particles (photons) as qubits and have the advantage of potentially operating at room temperature.14 Other modalities, such as neutral atoms and quantum dots, are also under active research and development. 14 Each of these qubit technologies presents its own set of advantages in terms of scalability, stability, and coherence times (the duration for which a qubit can maintain its quantum state), as well as challenges related to control mechanisms and error rates. 14 The manufacturing of quantum chips, which are the core component of QPUs, is a highly specialized and intricate process that demands precision engineering, advanced materials science, and a deep understanding of quantum physics. 15 The ongoing research and development efforts across these different qubit technologies underscore the evolving nature of the field and the critical role of hardware advancements in unlocking the full potential of quantum computing.

3. Neural Networks: Core Concepts and Architectures

Neural networks, also known as artificial neural networks (ANNs), are a class of machine learning models inspired by the complex structure and functioning of biological neural networks found in the human brain.1 These computational models are designed to recognize patterns in data by mimicking the way biological neurons work together to process information, weigh options, and arrive at conclusions.3 The fundamental building blocks of neural networks are artificial neurons, or nodes, which are interconnected in layers.2 A typical neural network consists of an input layer that receives data, one or more hidden layers that process the information, and an output layer that produces the final result.2 Each connection between two neurons has an associated weight, which determines the strength or influence of one neuron on another.2 When data is fed into the network, each neuron receives inputs from the neurons in the preceding layer, multiplies these inputs by their respective weights, and then sums the weighted inputs.5 This sum is then passed through an activation function, which introduces non-linearity into the network and determines the output of the neuron.5 If the output of a neuron exceeds a certain threshold, it becomes activated and sends data to the next layer of the network; otherwise, no data is passed along.3 This layered architecture and the interplay of weights and activation functions enable neural networks to learn intricate patterns and relationships from data.4 By adjusting the weights during a process called training, the network can adapt its response to different inputs and improve its performance on a specific task.3
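
The weighted-sum-and-activation behaviour described above can be sketched in a few lines; the weights and bias here are arbitrary illustrative values, and sigmoid stands in for whichever activation function a given network uses:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs plus a bias,
    passed through a sigmoid activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))   # sigmoid squashes z into (0, 1)

# Example: two inputs with hypothetical weights and bias.
out = neuron([0.5, -1.0], [0.8, 0.2], bias=0.1)
print(out)  # a value strictly between 0 and 1
```

A full layer is just many such neurons sharing the same inputs, and a network is a stack of layers feeding each other.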

The field of neural networks encompasses a variety of architectures, each designed to address different types of data and learning tasks. 2 Feedforward neural networks are characterized by a unidirectional flow of information, moving from the input layer through the hidden layers to the output layer without any loops or cycles. 2 These networks are commonly used for tasks such as classification and regression that involve sequential data processing.6 Convolutional neural networks (CNNs) are particularly well-suited for processing grid-like data, such as images and videos. 2 CNNs utilize specialized layers called convolutional layers that apply filters to the input data to automatically learn hierarchical features, making them highly effective for tasks like image recognition, object detection, and image segmentation. 2 Recurrent neural networks (RNNs) are designed to handle sequential data, such as time series and natural language.6 Unlike feedforward networks, RNNs incorporate feedback loops, allowing them to maintain a hidden state that acts as a memory of past inputs, making them suitable for tasks like language modelling, speech recognition, and sentiment analysis.6 Deep neural networks are characterized by having multiple hidden layers, often with millions of interconnected artificial neurons. 2 This depth allows deep neural networks to learn more complex and abstract representations from data, enabling them to tackle highly challenging tasks. 2 The choice of neural network architecture depends heavily on the nature of the problem being addressed and the characteristics of the data being used, with each type offering unique strengths and weaknesses for different applications. 2

The process of learning in classical neural networks involves iteratively adjusting the weights and biases of the connections between neurons based on the training data.3 The most common learning paradigm is supervised learning, where the network is trained using labelled data consisting of input-output pairs.5 The goal of supervised learning is for the network to learn a mapping function that can accurately predict the output for new, unseen inputs.5 This is typically achieved through an optimization process that aims to minimize the difference between the network's predictions and the actual target outputs in the training data.4 A widely used algorithm for this optimization is backpropagation, which calculates the gradient of the error (the difference between the predicted and actual outputs) with respect to the network's weights and then uses this gradient to update the weights in a direction that reduces the error.2 This process is repeated iteratively over the training data until the network reaches an acceptable level of performance.5 In contrast to supervised learning, unsupervised learning involves training neural networks on unlabelled data, where the network's objective is to discover underlying patterns, structures, or relationships in the data without explicit guidance.5 Techniques like clustering and dimensionality reduction fall under this category.5 Another learning paradigm is reinforcement learning, where an agent (often a neural network) learns to make decisions by interacting with an environment and receiving feedback in the form of rewards or penalties.5 The agent's goal is to learn a policy that maximizes the cumulative reward over time.5 Regardless of the learning paradigm, the iterative adjustment of network parameters based on data is fundamental to the ability of neural networks to learn and improve their performance on various tasks.5
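
This iterative weight adjustment can be illustrated with a deliberately tiny supervised example: one trainable weight, three hypothetical input-output pairs consistent with y = 2x, a squared-error loss, and plain gradient descent (the update rule that backpropagation applies throughout a full network):

```python
# Toy supervised learning: fit y = 2x with a single weight.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, target) pairs
w = 0.0          # initial weight
lr = 0.05        # learning rate

for epoch in range(200):
    for x, y in data:
        pred = w * x
        grad = 2 * (pred - y) * x   # d/dw of the squared error (pred - y)^2
        w -= lr * grad              # step against the gradient
print(w)  # converges close to 2.0
```

In a real network the same idea applies to millions of weights at once, with backpropagation supplying all the gradients via the chain rule.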

4. The Emergence of Quantum Neural Networks: Bridging the Gap

The increasing computational demands of modern machine learning, particularly with the rise of large-scale and complex datasets, have motivated researchers to explore alternative computational paradigms beyond classical computing. 27 Classical neural networks, while incredibly powerful, can face limitations in terms of training time and resource consumption when dealing with such data. 27 Quantum computing, with its potential for exponential speedups for certain types of computations, offers a promising avenue to address some of these limitations. 24 This has led to the emergence of quantum neural networks (QNNs), a field that aims to integrate the pattern recognition capabilities of classical neural networks with the computational advantages offered by quantum information processing. 26 The core motivation behind this exploration is the potential to achieve significant computational speedups and enhanced capabilities for machine learning tasks that are currently intractable or highly inefficient for classical computers.27 Researchers anticipate that QNNs could pave the way for breakthroughs in various computationally intensive fields, such as drug discovery, materials science, and advanced artificial intelligence, by enabling faster and more accurate analysis of complex datasets. 24

The theoretical foundation for integrating quantum computing with neural network concepts lies in the ability to represent and manipulate data using the principles of quantum mechanics.36 In QNNs, classical data is often encoded into quantum states, typically using qubits.36 These quantum states are then processed using sequences of quantum gates, which are analogous to the operations performed by neurons in classical neural networks.36 Quantum circuits, which are composed of these parameterized quantum gates, can be viewed as a quantum analogue of neural network architectures.25 The unique quantum phenomena of superposition, entanglement, and interference can be harnessed within these quantum circuits to perform computations in ways that have no direct classical counterparts.14 For instance, superposition allows a quantum circuit to explore multiple computational paths simultaneously, while entanglement can enable efficient processing of correlated data.14 Researchers are actively investigating how the layers and operations of classical neural networks can be effectively mapped onto quantum circuits, with qubits representing neurons and quantum gates representing the computational steps.37 This translation process requires careful consideration of how fundamental classical concepts, such as non-linear activation functions and weight updates, can be implemented within the framework of quantum computing, which is inherently based on linear and unitary operations.37

Quantum processing units offer the potential to address several key limitations of classical neural networks, particularly in areas requiring significant computational resources. One of the most promising aspects is the potential for QPUs to accelerate the training of neural networks. By leveraging quantum parallelism, QPUs can explore a vast number of network configurations and parameter settings concurrently, potentially reducing the time needed to find optimal or near-optimal solutions.24 Furthermore, specific quantum algorithms have been developed that could provide speedups for computational tasks commonly encountered in machine learning. For example, Grover's algorithm offers a quadratic speedup for searching through large, unstructured datasets, which could be beneficial in data retrieval and classification tasks within neural network workflows. 27 Quantum Support Vector Machines (QSVMs) are another area of active research, with the potential to handle large, high-dimensional datasets more efficiently than their classical counterparts, making them particularly relevant for complex classification problems. 27 Similarly, Quantum Principal Component Analysis (QPCA) offers the possibility of faster dimensionality reduction for large datasets, which is a crucial preprocessing step in many machine learning pipelines.27 The ability of QPUs to inherently operate on quantum states also opens up possibilities for processing and learning from quantum data, which is increasingly relevant in fields like quantum chemistry and materials science.13 Overall, QPUs present a unique set of computational capabilities that could potentially overcome existing bottlenecks in classical neural network construction, training, and application, particularly for computationally demanding tasks and those involving quantum data.
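
The practical meaning of Grover's quadratic speedup can be illustrated with a back-of-the-envelope query count. The iteration estimate floor(pi/4 * sqrt(N)) for finding one marked item among N is the standard textbook figure; the classical column is the expected number of queries for an unstructured linear search:

```python
import math

def grover_iterations(n_items):
    """Approximate Grover iterations to find one marked item among
    n_items: floor(pi/4 * sqrt(N))."""
    return math.floor(math.pi / 4 * math.sqrt(n_items))

for n in (1_000, 1_000_000, 1_000_000_000):
    classical = n // 2                 # expected classical queries
    quantum = grover_iterations(n)
    print(n, classical, quantum)
```

For a billion items the gap is roughly 500 million queries versus about 25 thousand iterations, which is why even a "merely" quadratic speedup matters at scale.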

5. Exploring the Landscape of Quantum Neural Network Architectures

The development of quantum neural network architectures is a dynamic and rapidly evolving area of research, with numerous proposals exploring different ways to integrate quantum computing principles with neural network concepts. 25 One fundamental approach involves the concept of quantum perceptrons, which aim to define a quantum analogue for the basic perceptron unit found in classical neural networks.37 These proposals often explore how to mimic the non-linear activation functions of classical perceptrons within the constraints of quantum mechanics, which typically involves linear operations and probabilistic measurements. 37 Another class of architectures includes feedforward QNNs, which, similar to their classical counterparts, are structured in layers where information flows in one direction from an input layer of qubits to subsequent layers. 37 These layers of qubits perform quantum computations, and the output is eventually obtained from the final layer. The number of qubits in each layer can vary, allowing for the construction of networks with different widths. Training these feedforward QNNs often involves adapting classical training techniques to the quantum domain. 37

Inspired by the success of convolutional neural networks in classical machine learning for tasks involving grid-like data, researchers have also proposed Quantum Convolutional Neural Networks (QCNNs).34 These architectures typically consist of a sequence of quantum convolutional layers and pooling layers. The convolutional layers apply quantum filters to the input data to extract spatial features, while the pooling layers reduce the dimensionality of the quantum state.39 QCNNs leverage the principles of superposition and entanglement to potentially achieve enhanced feature extraction and classification performance for both quantum and classical data.39 Hybrid quantum-classical networks represent another significant direction in QNN research.29 These architectures combine the use of quantum circuits for specific computational tasks with classical processing steps for data encoding, parameter optimization, and result interpretation. For instance, a hybrid approach might involve using a parameterized quantum circuit to perform a non-linear transformation of the input data, followed by a classical neural network for the final classification or regression task.47 The parameters of the quantum circuit are then typically optimized using classical optimization algorithms in a feedback loop. The variety of proposed QNN architectures reflects the ongoing effort to effectively harness the unique capabilities of quantum computing for different types of neural network functionalities, drawing inspiration from classical designs while adapting them to the quantum realm.
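
A minimal hybrid forward pass might look as follows. This is a purely classical simulation, not a quantum SDK call: the "quantum layer" is a one-qubit circuit (angle-encode the input with RY(x), apply a trainable RY(theta) rotation, return the expectation <Z>, which for this circuit reduces to cos(x + theta)), and a classical logistic head turns that expectation into a score. All parameter values are hypothetical:

```python
import math

def quantum_layer(x, theta):
    """Simulated one-qubit circuit: angle-encode input x via RY(x),
    apply a trainable rotation RY(theta), and return <Z>."""
    a0 = math.cos((x + theta) / 2)   # amplitude of |0>
    a1 = math.sin((x + theta) / 2)   # amplitude of |1>
    return a0 ** 2 - a1 ** 2         # <Z> = P(0) - P(1) = cos(x + theta)

def classical_head(z, w, b):
    """Classical post-processing: a logistic layer on the circuit output."""
    return 1 / (1 + math.exp(-(w * z + b)))

# Hypothetical input and parameters, purely for illustration.
p = classical_head(quantum_layer(0.4, theta=0.3), w=2.0, b=0.0)
print(p)  # a probability-like score in (0, 1)
```

In the feedback loop described above, a classical optimizer would adjust theta, w, and b together based on a loss computed from such scores.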

The translation of classical neural network layers and operations into quantum circuits involves several key steps. First, classical data needs to be represented in a quantum format, which is typically achieved through a process called quantum encoding.36 Common encoding strategies include amplitude encoding, where the amplitudes of the quantum state represent the data, and angle encoding, where the data is encoded in the rotation angles of quantum gates.36 Once the data is encoded into a quantum state (a collection of qubits), quantum gates are applied to manipulate these qubits.17 Single-qubit gates can perform rotations of individual qubits, while two-qubit gates can create entanglement between pairs of qubits.17 These quantum gates perform the computational steps within the quantum circuit, analogous to the operations performed by neurons in a classical neural network. After the quantum computation is performed, the state of the qubits is measured to obtain the output.36 The measurement process collapses the quantum state into a classical bit string, which can then be interpreted as a prediction or a decision, depending on the specific task.36 A significant challenge in this translation process is the implementation of non-linear activation functions, which are essential for the expressivity of classical neural networks.37 Quantum evolution, governed by the Schrödinger equation, is inherently linear, making it difficult to directly implement non-linearities. Researchers are exploring various approaches to address this challenge, including using specific measurement techniques, employing non-linear quantum operators (though this is a disputed framework), or approximating non-linearities through sequences of linear quantum gates.37 The translation of classical neural network concepts into the quantum domain thus requires careful consideration of data representation, computational steps, and the introduction of non-linearities within the constraints of quantum mechanics.
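
The two encoding strategies can be sketched as follows. This is a classical simulation in which state vectors are plain Python lists of amplitudes:

```python
import math

def angle_encode(x):
    """Angle encoding: one feature -> one qubit, rotated by RY(x).
    Returns the qubit's state vector [amp_0, amp_1]."""
    return [math.cos(x / 2), math.sin(x / 2)]

def amplitude_encode(features):
    """Amplitude encoding: a length-2^n feature vector becomes the
    amplitudes of an n-qubit state (after normalisation)."""
    norm = math.sqrt(sum(f * f for f in features))
    return [f / norm for f in features]

state = amplitude_encode([3.0, 0.0, 4.0, 0.0])   # 4 features -> 2 qubits
print(state)                        # [0.6, 0.0, 0.8, 0.0]
print(angle_encode(math.pi / 2))    # equal superposition of |0> and |1>
```

The trade-off is visible even here: amplitude encoding packs 2^n features into n qubits but requires a potentially deep state-preparation circuit, while angle encoding uses one qubit (and one cheap rotation) per feature.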

6. Potential Advantages of Leveraging QPUs for Neural Networks

Leveraging quantum processing units for the construction and training of neural networks holds the potential for several significant advantages, both theoretically and practically. One of the most anticipated benefits is the possibility of achieving exponential speedups for certain machine learning algorithms compared to what is achievable with classical computers.15 This speedup stems from the inherent parallelism offered by quantum computing, where QPUs can explore a vast computational space simultaneously due to the principles of superposition and entanglement.15 This parallel processing capability could significantly accelerate the training of complex neural networks, particularly those involved in solving intricate optimization problems to find the best model parameters.24 In some instances, quantum neural networks might also require less training data to achieve a level of performance comparable to or even better than classical neural networks, potentially leading to more efficient learning processes, especially in data-scarce environments.44 Furthermore, the ability of qubits to exist in multiple states simultaneously can lead to an enhanced representational capacity in QNNs, allowing them to model more complex relationships and patterns within data compared to classical networks with a similar number of parameters.31 This enhanced capacity could be particularly beneficial for tasks involving high-dimensional data and intricate dependencies.

Several specific quantum algorithms have been identified that could offer significant benefits for tasks commonly performed in neural networks. The Quantum Fourier Transform (QFT) is a quantum analogue of the classical Discrete Fourier Transform and can be implemented exponentially faster than its classical counterpart, the Fast Fourier Transform (FFT).43 Since the FFT is a fundamental component of some neural network architectures, such as convolutional neural networks, the QFT could potentially lead to speedups in these areas.43 Grover's algorithm, known for its quadratic speedup in searching unsorted databases, could be applied to various search-related tasks within machine learning algorithms.27 The Variational Quantum Eigensolver (VQE) is another promising algorithm that can be used to find the eigenvalues of a matrix, a task that is central to many machine learning algorithms, including those based on spectral methods and dimensionality reduction techniques.29 The Quantum Approximate Optimization Algorithm (QAOA) is designed to find approximate solutions to combinatorial optimization problems, which are prevalent in machine learning, such as finding optimal network architectures or training parameters.29 By leveraging these and other quantum algorithms, QPUs could potentially provide substantial computational advantages for various aspects of neural network construction, training, and application, paving the way for solving problems that are currently beyond the reach of classical computing.
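
A directly simulated version of the QFT makes its action concrete. Note the asymmetry that motivates the speedup claim: this classical simulation costs O(N^2) operations on the full amplitude vector, whereas the quantum circuit realises the same transform with O(n^2) gates for n = log2(N) qubits (subject to the usual caveats about loading data in and reading results out):

```python
import cmath
import math

def qft(amps):
    """Quantum Fourier Transform of a state vector, simulated directly:
    |j>  ->  (1/sqrt(N)) * sum_k exp(2*pi*i*j*k/N) |k>."""
    size = len(amps)
    return [sum(amps[j] * cmath.exp(2j * math.pi * j * k / size)
                for j in range(size)) / math.sqrt(size)
            for k in range(size)]

# QFT of the basis state |2> on 2 qubits: every outcome becomes equally
# likely, with the input index encoded in the output phases.
out = qft([0, 0, 1, 0])
print([round(abs(a), 3) for a in out])  # [0.5, 0.5, 0.5, 0.5]
```

The uniform magnitudes with index-dependent phases are exactly what algorithms such as phase estimation exploit.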

7. Challenges and Limitations in the Development and Training of QNNs

Despite the promising potential of leveraging QPUs for neural networks, the field faces several significant challenges and limitations that need to be addressed before these advantages can be fully realized. One of the most prominent limitations is the current state of quantum hardware.55 Existing quantum computers, often referred to as noisy intermediate-scale quantum (NISQ) computers, have a relatively small number of qubits, typically ranging from tens to a few hundred. 15 This limited qubit count restricts the size and complexity of quantum neural networks that can be implemented and trained.15 Furthermore, qubits are inherently fragile and highly susceptible to environmental noise, leading to a phenomenon called decoherence, where they lose their quantum properties over time.18 This limits the duration of quantum computations that can be performed before errors become significant. 18

Another major hurdle is the challenge of quantum error correction.22 Unlike classical bits, qubits cannot be simply copied due to the no-cloning theorem, making error correction significantly more complex.37 Implementing effective quantum error correction requires encoding quantum information in a redundant manner using multiple physical qubits to represent a single logical qubit, which adds to the hardware overhead.22 A significant algorithmic challenge in training QNNs is the "barren plateau" problem.32 This phenomenon occurs in many classes of QNNs, where the gradients of the loss function with respect to the network's parameters become exponentially small as the number of qubits increases.32 This makes it extremely difficult to train large and deep QNNs using gradient-based optimization methods, as the parameter updates become negligible.32

Efficiently encoding classical data into quantum states is another non-trivial challenge.36 For high-dimensional data, such as images or complex datasets, the number of qubits required for encoding can be substantial, potentially negating some of the advantages of using a quantum computer. 36 Furthermore, the process of measuring the output of a quantum computation is probabilistic. 13 To obtain a reliable result, quantum algorithms often need to be run multiple times, and the outcomes need to be averaged, which adds to the overall computational cost. 13 Finally, implementing non-linear activation functions, which are crucial for the expressivity and learning capabilities of classical neural networks, is challenging in the context of quantum computing due to the inherent linearity of quantum evolution. 37 Researchers are exploring various ways to approximate or circumvent this limitation, but a direct and efficient quantum analogue of classical non-linear activation functions remains an open problem.37 These challenges underscore the significant gap between the theoretical potential of QNNs and the current practical capabilities of quantum hardware and algorithms.
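
The cost of probabilistic measurement can be illustrated with a simple shot-based estimator. This is a classical simulation with an assumed outcome probability: each "shot" is one measurement of a qubit whose true expectation is <Z> = p0 - p1, and the estimate only tightens as the shot count (and hence runtime) grows:

```python
import random

def estimate_expectation(p0, shots):
    """Estimate <Z> for a qubit measured to be 0 with probability p0.
    Each shot is one probabilistic measurement; averaging many shots
    trades runtime for statistical accuracy."""
    total = 0
    for _ in range(shots):
        outcome = 0 if random.random() < p0 else 1
        total += 1 if outcome == 0 else -1   # Z eigenvalues: +1 for |0>, -1 for |1>
    return total / shots

random.seed(0)
# True <Z> = 0.8 - 0.2 = 0.6; the statistical error shrinks like 1/sqrt(shots).
for shots in (10, 100, 10_000):
    print(shots, estimate_expectation(p0=0.8, shots=shots))
```

Because the error decreases only as 1/sqrt(shots), every extra decimal digit of precision costs roughly a hundredfold more circuit repetitions, which is a real bottleneck for gradient-based QNN training.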

Table 1: Challenges in Quantum Neural Network Training

Limited Number of Qubits: Current quantum computers have a small number of qubits, restricting the size and complexity of QNNs.

Decoherence: Qubits are sensitive to noise, leading to a loss of quantum properties over time, limiting computation duration.

Error Correction: Correcting errors in quantum computations is complex and requires additional resources.

Barren Plateaus: Gradients in the loss landscape can vanish for large QNNs, hindering training.32

Data Encoding: Efficiently mapping classical data to quantum states can be challenging.

Measurement Limitations: Quantum measurements are probabilistic and require multiple repetitions.

Non-linear Activation Functions: Implementing the equivalent of non-linearities from classical neural networks in quantum systems is difficult.

8. The Current State of Research and Practical Applications


The field of quantum machine learning, including the development and application of quantum neural networks, is currently a vibrant and rapidly expanding area of research.18 There is a growing interest from academia, industry, and government institutions in exploring the potential of combining quantum computing with machine learning techniques to solve complex problems.18 Researchers are actively investigating a wide range of QNN architectures, innovative training methodologies, and potential applications across various scientific and industrial domains.25 Given the current limitations of quantum hardware, hybrid quantum-classical approaches, which combine the strengths of both quantum and classical computing, are considered the most practical way to leverage near-term quantum devices for machine learning tasks.29 While the field is characterized by a significant amount of theoretical exploration and promising proposals, experimental validation on real-world problems using current quantum hardware remains limited.16 This indicates that while the potential is substantial, the field is still in its early stages of development, and significant advancements in both quantum hardware and algorithms are necessary to fully realize the envisioned benefits.16

Despite the current limitations, researchers are exploring a range of potential near-term practical applications for QNNs across various sectors. In the domain of drug discovery and pharmaceutical development, QNNs could potentially accelerate the process of identifying new drug candidates by enabling more efficient simulation of molecular interactions and analysis of complex biological data.13 In the financial industry, QNNs are being investigated for applications in financial modelling, risk analysis, and fraud detection, where the ability to process large datasets and identify subtle patterns could offer a significant advantage.15 Cryptography and cybersecurity represent another area where QNNs could play a crucial role, both in developing new quantum-resistant encryption techniques and in potentially breaking existing classical encryption methods.15 In materials science and chemistry, QNNs could aid in the discovery and design of novel materials with desired properties by enabling more accurate quantum simulations.15 Optimization problems, which arise in various fields such as logistics, energy, and finance, are also being targeted as potential applications for QNNs, leveraging their ability to explore multiple solutions simultaneously.15 Furthermore, QNNs are being explored for fundamental artificial intelligence and machine learning tasks, including image and speech recognition, natural language processing, and anomaly detection, with the hope of achieving improved accuracy and efficiency compared to classical methods.5 While these potential applications are diverse and promising, the practical realization of QNNs in these domains is still largely in the research and development phase, awaiting further advancements in both quantum hardware and algorithms.16

The current capabilities and limitations of available quantum hardware significantly influence the types of QML algorithms, including QNNs, that can be effectively explored and the complexity of problems that can be addressed.29 As mentioned earlier, current NISQ computers have a limited number of qubits and are susceptible to noise, which restricts the implementation of deep and complex QNN architectures.4 This has led researchers to focus on developing quantum algorithms that are suitable for these near-term devices, such as variational quantum algorithms and hybrid quantum-classical approaches, which typically involve shallower quantum circuits and leverage classical optimization techniques.29 The development of fault-tolerant quantum computers with a significantly larger number of high-quality, stable qubits is considered essential to fully unlock the potential of QNNs for solving complex, real-world machine learning problems that currently surpass the capabilities of classical systems.29 Progress in quantum hardware development, including increasing qubit count, improving coherence times, and reducing error rates, is therefore a critical prerequisite for the widespread adoption and practical application of QNNs in various fields.29

9. Training Methodologies for Quantum Neural Networks

Training quantum neural networks is a crucial aspect of their development and application, and researchers are exploring various methodologies to effectively adjust the parameters of these quantum models.25 As with classical neural networks, training a QNN typically involves defining a loss function that quantifies the difference between the network's output and the desired target output over a set of training data.25 The goal of training is to find the circuit parameters that minimize this loss function.25 One common approach uses gradient-based optimization methods, such as gradient descent, to iteratively update the parameters of the quantum circuit based on the gradient of the loss function.25 For quantum circuits, these gradients can be computed with techniques such as the parameter-shift rule.38
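As a minimal illustration of the parameter-shift rule, consider a one-parameter circuit RY(θ)|0⟩ measured in the Z basis, whose expectation value is cos(θ). The sketch below simulates this classically with NumPy; all function names are illustrative assumptions, not the interface of any real quantum library or device.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation as a 2x2 real unitary."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

Z = np.diag([1.0, -1.0])  # Pauli-Z observable

def expectation(theta):
    """<psi|Z|psi> for |psi> = RY(theta)|0>; analytically equal to cos(theta)."""
    psi = ry(theta) @ np.array([1.0, 0.0])
    return float(psi @ Z @ psi)

def parameter_shift_grad(theta):
    """Exact gradient via the parameter-shift rule:
    d<Z>/dtheta = [f(theta + pi/2) - f(theta - pi/2)] / 2."""
    return (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2)) / 2.0

print(parameter_shift_grad(0.7))  # equals -sin(0.7) up to floating-point error
```

Unlike finite differences, the two shifted circuit evaluations give the gradient exactly (not an approximation), which is why the rule suits noisy hardware where tiny finite-difference steps would be swamped by shot noise.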

Given the limitations of current quantum hardware, hybrid quantum-classical training techniques have emerged as a prominent strategy.29 In these approaches, a quantum computer performs the forward pass of the QNN, evaluating the loss function and potentially estimating its gradients, while a classical optimizer running on a classical computer updates the parameters of the quantum circuit based on this information.29 This allows researchers to use the quantum computer for the parts of the computation where it might offer an advantage, while relying on classical optimizers for parameter updates.29 To speed up training, techniques such as quantum data parallelism have been explored, in which multiple data points are encoded and processed simultaneously on the quantum computer.25 More advanced methodologies are also under investigation, such as meta-learning approaches in which classical networks, for example recurrent neural networks, learn to optimize the parameters of variational quantum algorithms, potentially leading to more efficient training.85
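The hybrid loop described above can be sketched in a few lines: a quantum forward pass (here simulated classically) evaluates the loss, the parameter-shift rule supplies the gradient, and an ordinary gradient-descent step on a classical machine updates the circuit parameter. The single-parameter circuit, the target value, and all names are illustrative assumptions, not a real QPU interface.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY gate; stands in for the parameterized quantum circuit."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def forward(theta):
    """'Quantum' forward pass: <Z> after RY(theta)|0>, simulated classically."""
    psi = ry(theta) @ np.array([1.0, 0.0])
    return float(psi @ np.diag([1.0, -1.0]) @ psi)

def loss(theta, target=0.0):
    """Squared error between the circuit output and a target expectation value."""
    return (forward(theta) - target) ** 2

def grad(theta, target=0.0):
    # Chain rule: dL/dtheta = 2 (f - target) * df/dtheta,
    # with df/dtheta obtained from the parameter-shift rule.
    df = (forward(theta + np.pi / 2) - forward(theta - np.pi / 2)) / 2.0
    return 2.0 * (forward(theta) - target) * df

theta, lr = 0.3, 0.4
for _ in range(50):                # classical optimizer in the outer loop
    theta -= lr * grad(theta)      # the quantum device only evaluates forward()
```

On real hardware, each call to `forward` would be replaced by repeated circuit executions (shots) whose averaged measurement outcomes estimate the expectation value, while the update step stays entirely classical.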

Training quantum neural networks presents several unique challenges that are not typically encountered in classical machine learning. As previously discussed, the barren plateau problem, in which gradients vanish exponentially as QNNs grow, poses a significant obstacle for gradient-based training methods.32 The inherent noise in current quantum devices can also degrade the accuracy of gradient estimates and hinder the convergence of training.30 The choice of quantum circuit architecture, often referred to as the ansatz, and the strategy used to encode classical data into quantum states can significantly affect both the trainability and the ultimate performance of a QNN.25 Developing loss functions that are well suited to quantum computations and can guide training efficiently is also an ongoing area of research.25 Moreover, the absence of a direct quantum equivalent of the backpropagation algorithm, a cornerstone of training deep classical neural networks, complicates the development of deep QNNs.54 Overcoming these training-specific challenges is crucial for making QNNs a viable and practical approach for solving real-world problems on quantum computers.

10. Conclusion: The Future Potential of QPUs in Neural Network Construction

In summary, quantum processing units hold considerable promise for enhancing certain aspects of neural network construction and training by leveraging the unique principles of quantum mechanics.15 Quantum neural networks represent a burgeoning field of research that aims to overcome some of the inherent limitations of classical neural networks, particularly for computationally intensive tasks and those involving quantum data.24 While various QNN architectures and training methodologies have been proposed and explored, the field is still grappling with significant challenges, primarily stemming from the current limitations of quantum hardware, including the limited number of qubits, short coherence times, and the presence of noise.15 Issues such as the barren plateau problem and the complexities of data encoding and measurement also present substantial hurdles.32 Currently, hybrid quantum-classical approaches appear to be the most feasible strategy for harnessing the potential benefits of QPUs for machine learning tasks within the constraints of near-term quantum devices.29 The development of fault-tolerant quantum computers with a significantly larger number of high-quality qubits will be a critical milestone for unlocking the full potential of QNNs and achieving a clear quantum advantage over classical methods for a broader range of machine learning problems.29

Looking ahead, research and development in this field are likely to focus on several key areas. One important direction will be the design of more robust and scalable QNN architectures that are specifically tailored to the capabilities and limitations of near-term quantum hardware.36 Continued advancements in quantum error correction will be essential for building larger and more reliable quantum computers capable of running complex QNN algorithms.22 A significant area of focus will also be the exploration of novel training methodologies that can mitigate the barren plateau problem and are more resilient to the noise inherent in current quantum systems.32 Identifying specific real-world applications where QNNs demonstrably outperform classical methods, showcasing a clear quantum advantage, will be crucial for driving the field forward and justifying further investment and research.24 Finally, the development of more user-friendly and efficient quantum programming tools and software libraries will be vital for enabling a broader community of researchers and practitioners to explore and implement QNNs for various applications.38

In conclusion, if the current challenges can be effectively addressed through sustained research and technological advancements, quantum neural networks hold the potential to revolutionize the field of artificial intelligence.24 This could lead to breakthroughs in areas such as drug discovery, materials science, finance, and optimization by enabling faster training, the ability to handle larger and more complex datasets, and the solution of problems that are currently intractable for classical computing systems.24 However, it is important to maintain a realistic perspective, recognizing that quantum computing is still a nascent technology and that the timeline for the widespread practical application of QNNs remains uncertain.16 The future of QNNs is undoubtedly promising, but its realization hinges on continued interdisciplinary collaboration and significant progress in both quantum hardware and software development.

Works cited

1. aws.amazon.com, accessed on April 23, 2025, https://aws.amazon.com/whatis/neural-network/#:~:text=A%20neural%20network%20is%20a,that %20resembles%20the%20human%20brain.

2. What is a Neural Network? - AWS, accessed on April 23, 2025, https://aws.amazon.com/what-is/neural-network/

3. What is a Neural Network? | IBM, accessed on April 23, 2025, https://www.ibm.com/think/topics/neural-networks

4. Neural network (machine learning) - Wikipedia, accessed on April 23, 2025, https://en.wikipedia.org/wiki/Neural_network_(machine_learning)

5. What is a Neural Network? - GeeksforGeeks, accessed on April 23, 2025, https://www.geeksforgeeks.org/neural-networks-a-beginners-guide/

6. Neural Networks 101: Understanding the Basics of This Key AI Technology, accessed on April 23, 2025, https://online.nyit.edu/blog/neural-networks-101understanding-the-basics-of-key-ai-technology

7. Artificial Neural Network: Understanding the Basic Concepts without Mathematics - PMC, accessed on April 23, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC6428006/

8. A Basic Introduction to Feedforward Backpropagation Neural Networks (after Leverington, 2001) - webpages, accessed on April 23, 2025, http://www.webpages.ttu.edu/dleverin/neural_network/neural_networks.html

9. What is a Neural Network? - GeeksforGeeks, accessed on April 23, 2025, https://www.geeksforgeeks.org/neural-networks-a-beginners-guide/?ref=rp

10. Neural Networks | NNLM, accessed on April 23, 2025, https://www.nnlm.gov/guides/data-glossary/neural-networks

11. The Essential Guide to Neural Network Architectures - V7 Labs, accessed on April 23, 2025, https://www.v7labs.com/blog/neural-network-architecturesguide

12. www.ibm.com, accessed on April 23, 2025, https://www.ibm.com/think/topics/qpu#:~:text=Senior%20Editorial %20Strategist-,What%20is%20a%20QPU%20(quantum%20processing %20unit)%3F,problems%20by%20using%20quantum%20mechanics.

13. What is a QPU (Quantum Processing Unit)? - IBM, accessed on April 23, 2025, https://www.ibm.com/think/topics/qpu

14. Quantum Processing Unit (QPU) - QuEra Computing, accessed on April 23, 2025, https://www.quera.com/glossary/processing-unit

15. The Science Behind QPUs: Unlocking Quantum Computing Power - SpinQ, accessed on April 23, 2025, https://www.spinquanta.com/newsDetail/242ee26e-7f27-4547-bd8697f49c10fdd3

16. What Is a QPU? | NVIDIA Blogs, accessed on April 23, 2025, https://blogs.nvidia.com/blog/what-is-a-qpu/

17. Physical Principles Underpinning Quantum Computing - EE Times Europe, accessed on April 23, 2025, https://www.eetimes.eu/physical-principlesunderpinning-quantum-computing/

18. What Is Quantum Computing? - IBM, accessed on April 23, 2025, https://www.ibm.com/think/topics/quantum-computing

19. 1 High-Performance Computing with Quantum Processing Units - OSTI, accessed on April 23, 2025, https://www.osti.gov/servlets/purl/1357960

20. 39 High-Performance Computing with Quantum Processing Units, accessed on April 23, 2025, https://www.cse.wustl.edu/~roger/566S.s21/3007651.pdf

21. Quantum Processor Units (QPUs) - Meegle, accessed on April 23, 2025, https://www.meegle.com/en_us/topics/quantum-computing/quantumprocessor-units-qpus

22. Quantum Processing Units (QPUs): The Future of Computing - Unite.AI, accessed on April 23, 2025, https://www.unite.ai/quantum-processing-unitsthe-future-of-computing/

23. Explore Quantum Computer Systems: Types & Key Features - SpinQ, accessed on April 23, 2025, https://www.spinquanta.com/news-detail/explorequantum-computer-systems-types-key-features20250108144529

24. Top Advantages of Quantum Computers & Their Future Potential - SpinQ, accessed on April 23, 2025, https://www.spinquanta.com/news-detail/topadvantages-of-quantum-computers-their-future-potential20250207021218

25. Quantum data parallelism in quantum neural networks | Phys. Rev. Research, accessed on April 23, 2025, https://link.aps.org/doi/10.1103/PhysRevResearch.7.013177

26. The quest for a Quantum Neural Network - ResearchGate, accessed on April 23, 2025, https://www.researchgate.net/publication/265209779_The_quest_for_a_Quant um_Neural_Network

27. The quantum advantage: How quantum computing will transform machine learning - Algorithma, accessed on April 23, 2025, https://www.algorithma.se/our-latest-thinking/the-quantum-advantage-howquantum-computing-will-transform-machine-learning

28. Top Applications Of Quantum Computing for Machine Learning, accessed on April 23, 2025, https://www.quera.com/blog-posts/applications-of-quantumcomputing-for-machine-learning

29. Bridging the Worlds of Quantum Computing and Machine Learning - SIAM.org, accessed on April 23, 2025, https://www.siam.org/publications/siam-news/articles/bridging-the-worlds-ofquantum-computing-and-machine-learning/

30. Bridging the Worlds of Quantum Computing and Machine Learning - SIAM.org, accessed on April 23, 2025, https://siam.org/publications/siam-news/articles/bridging-the-worlds-ofquantum-computing-and-machine-learning/

31. The Future Of AI: Unleashing The Power Of Quantum Machine Learning - Forbes, accessed on April 23, 2025, https://www.forbes.com/councils/forbestechcouncil/2024/06/24/the-future-of-aiunleashing-the-power-of-quantum-machine-learning/

32. The power of quantum neural networks - ETH Zurich, accessed on April 23, 2025, https://people.math.ethz.ch/~afigalli/papers-pdf/The-power-of-quantumneural-networks.pdf

33. Quantum Neural Network for Quantum Neural Computing - PMC - PubMed Central, accessed on April 23, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC10202373/

34. Reliability Research on Quantum Neural Networks - MDPI, accessed on April 23, 2025, https://www.mdpi.com/2079-9292/13/8/1514

35. Quantum neural network | 410 Publications | 3210 Citations | Top Authors | Related Topics, accessed on April 23, 2025, https://typeset.io/topics/quantumneural-network-2zuaauwj

36. What are Quantum Neural Networks? - QuEra Computing, accessed on April 23, 2025, https://www.quera.com/glossary/quantum-neural-networks

37. Quantum neural network - Wikipedia, accessed on April 23, 2025, https://en.wikipedia.org/wiki/Quantum_neural_network

38. Quantum Neural Networks - Qiskit Machine Learning 0.8.2, accessed on April 23, 2025, https://qiskit-community.github.io/qiskit-machine-learning/tutorials/ 01_neural_networks.html

39. Optimizing quantum convolutional neural network architectures for arbitrary data dimension - Frontiers, accessed on April 23, 2025, https://www.frontiersin.org/journals/physics/articles/10.3389/fphy.2025.152918 8/full

40. Three quantum neural network architectures for multiclass image classification - SPIE Digital Library, accessed on April 23, 2025, https://ebooks.spiedigitallibrary.org/conference-proceedings-of-spie/ 13545/135450K/Three-quantum-neural-network-architectures-for-multiclassimage-classification/10.1117/12.3060077.full

41. Quantum Neural Networks with Novel Architectures - EasyChair, accessed on April 23, 2025, https://www.easychair.org/publications/preprint/rhfQ

42. Quantum Computers Will Make AI Better - Quantinuum, accessed on April 23, 2025, https://www.quantinuum.com/blog/quantum-computers-will-make-aibetter

43. Google Researchers Say Quantum Theory Suggests a Shortcut for Learning Certain Neural Networks, accessed on April 23, 2025, https://thequantuminsider.com/2025/03/31/google-researchers-say-quantumtheory-suggests-a-shortcut-for-learning-certain-neural-networks/

44. The Impact of Quantum Computing on Machine Learning - IonQ, accessed on April 23, 2025, https://ionq.com/posts/the-impact-of-quantum-computing-onmachine-learning-post

45. Quantum computing and AI: less compatible than expected? - Polytechnique Insights, accessed on April 23, 2025, https://www.polytechnique-insights.com/en/columns/science/quantumcomputing-and-ai-less-compatible-than-expected/

46. Quantum Computing: What are its challenges and potential for companies? - Plain Concepts, accessed on April 23, 2025, https://www.plainconcepts.com/quantum-computing-potential-challenges/

47. Quantum Leap: Beyond the Limits of Machine Learning - Dataiku blog, accessed on April 23, 2025, https://blog.dataiku.com/quantum-leap-beyondthe-limits-of-machine-learning

48. Power and limitations of single-qubit native quantum neural networks, accessed on April 23, 2025, https://proceedings.neurips.cc/paper_files/paper/2022/file/b250de41980b58d34 d6aadc3f4aedd4c-Paper-Conference.pdf

49. What are the quantum simulation problems where AI doesn't eat quantum computing's lunch?, accessed on April 23, 2025, https://quantumcomputing.stackexchange.com/questions/40373/what-are-thequantum-simulation-problems-where-ai-doesnt-eat-quantum-computing

50. [D] Why the limitations in Quantum ML? : r/MachineLearning - Reddit, accessed on April 23, 2025, https://www.reddit.com/r/MachineLearning/comments/nf7861/d_why_the_limita tions_in_quantum_ml/

51. [D] Is quantum ML pointless? : r/MachineLearning - Reddit, accessed on April 23, 2025, https://www.reddit.com/r/MachineLearning/comments/tsffzi/d_is_quantum_ml_ pointless/

52. www.computer.org, accessed on April 23, 2025, https://www.computer.org/publications/tech-news/research/current-state-ofquantum-machine-learning/#:~:text=Currently%2C%20QML%20shows%20the %20potential,time%2C%20subtle%20issues%20present%20limitations.

53. The Current State of Quantum Machine Learning - IEEE Computer Society, accessed on April 23, 2025, https://www.computer.org/publications/tech-news/research/current-state-ofquantum-machine-learning/

54. What is Quantum machine learning ? : r/learnmachinelearning - Reddit, accessed on April 23, 2025, https://www.reddit.com/r/learnmachinelearning/comments/182ipt3/what_is_qua ntum_machine_learning/

55. Challenges In Implementing Quantum Neural Networks - FasterCapital, accessed on April 23, 2025, https://fastercapital.com/topics/challenges-inimplementing-quantum-neural-networks.html

56. Quantum Neural Networks: Concepts, Applications, and Challenges - ar5iv - arXiv, accessed on April 23, 2025, https://ar5iv.labs.arxiv.org/html/2108.01468

57. Challenges and opportunities in quantum machine learning | Request PDF - ResearchGate, accessed on April 23, 2025, https://www.researchgate.net/publication/363596480_Challenges_and_opportunities_in_quantum_machine_learning

58. Quantum Neural Networks: Issues, Training, and Applications - OSTI, accessed on April 23, 2025, https://www.osti.gov/servlets/purl/2337965

59. Quantum Neural Networks: Issues, Training, and Applications | Report | PNNL, accessed on April 23, 2025, https://www.pnnl.gov/publications/quantum-neuralnetworks-issues-training-and-applications

60. [D] What are challenges that Quantum Machine Learning can solve? - Reddit, accessed on April 23, 2025, https://www.reddit.com/r/MachineLearning/comments/rgk8db/d_what_are_chall enges_that_quantum_machine/

61. What are some of the open problems in quantum machine learning specially quantum neural networks? : r/QuantumComputing - Reddit, accessed on April 23, 2025, https://www.reddit.com/r/QuantumComputing/comments/w5k7l6/what_are_som e_of_the_open_problems_in_quantum/

62. Challenges and Opportunities in Quantum Machine Learning - Inspire HEP, accessed on April 23, 2025, https://inspirehep.net/literature/2643151

63. [2108.01468] Quantum Neural Networks: Concepts, Applications, and Challenges - arXiv, accessed on April 23, 2025, https://arxiv.org/abs/2108.01468

64. What is a quantum processing unit (QPU)? - Live Science, accessed on April 23, 2025, https://www.livescience.com/technology/computing/what-is-aquantum-processing-unit-qpu

65. CSIRO shows practical application for quantum machine learning, accessed on April 23, 2025, https://www.csiro.au/en/news/All/News/2025/January/CSIROshows-practical-application-for-quantum-machine-learning

66. CSIRO Shows Practical Application For Quantum Machine Learning, accessed on April 23, 2025, https://thequantuminsider.com/2025/01/29/csiro-showspractical-application-for-quantum-machine-learning/

67. The State of Quantum Computing: Where Are We Today? | Towards Data Science, accessed on April 23, 2025, https://towardsdatascience.com/thestate-of-quantum-computing-where-are-we-today-17ee19f51b1d/

68. A Beginner's Guide to Neural Networks and Deep Learning | Pathmind, accessed on April 23, 2025, http://wiki.pathmind.com/neural-network

69. www.nnlm.gov, accessed on April 23, 2025, https://www.nnlm.gov/guides/dataglossary/neural-networks#:~:text=An%20important%20characteristic%20of %20neural,may%20be%20difficult%20to%20interpret.

70. Introduction to Artificial Neural Network | Set 2 - GeeksforGeeks, accessed on April 23, 2025, https://www.geeksforgeeks.org/introduction-artificial-neuralnetwork-set-2/

71. Neural networks: Nodes and hidden layers | Machine Learning - Google for Developers, accessed on April 23, 2025, https://developers.google.com/machine-learning/crash-course/neuralnetworks/nodes-hidden-layers

72. Neural networks | Machine Learning - Google for Developers, accessed on April 23, 2025, https://developers.google.com/machine-learning/crash-course/neural-networks

73. Neural Networks: Types, Structure, and Characteristics [Development Notes] - Bintime, accessed on April 23, 2025, https://bintime.com/blog/neural-networkstypes-structure-and-characteristics

74. Explained: Neural networks | MIT News | Massachusetts Institute of Technology, accessed on April 23, 2025, https://news.mit.edu/2017/explainedneural-networks-deep-learning-0414

75. 6 Top Quantum Computer Applications with Real-World Examples - SpinQ, accessed on April 23, 2025, https://www.spinquanta.com/news-detail/topquantum-computer-applications-with-real-world-examples20250113034956

76. Quantum neural networks and quantum kernels deal with nonlinearities, accessed on April 23, 2025, https://quantumcomputing.stackexchange.com/questions/35464/quantumneural-networks-and-quantum-kernels-deal-with-nonlinearities

77. Machine Learning Applications of Quantum Computing: A Review - arXiv, accessed on April 23, 2025, https://arxiv.org/pdf/2406.13262

78. 10 Quantum Computing Applications & Examples to Know | Built In, accessed on April 23, 2025, https://builtin.com/hardware/quantum-computing-applications

79. What Is A QPU? The Processing Units Of The Future - Quantum Zeitgeist, accessed on April 23, 2025, https://quantumzeitgeist.com/what-is-a-qpu-theprocessing-units-of-the-future/

80. Training Classical Neural Networks by Quantum Machine Learning - arXiv, accessed on April 23, 2025, https://arxiv.org/html/2402.16465v1

81. Applications of Quantum Computing: r/QuantumComputing - Reddit, accessed on April 23, 2025, https://www.reddit.com/r/QuantumComputing/comments/1hmf5j5/applications_ of_quantum_computing/

82. thequantuminsider.com, accessed on April 23, 2025, https://thequantuminsider.com/2025/01/29/csiro-shows-practical-applicationfor-quantum-machine-learning/#:~:text=CSIRO%20Shows%20Practical %20Application%20For%20Quantum%20Machine%20Learning,Research&text=Researchers%20from%20CSIRO%20demonstrated %20that,healthcare%2C%20agriculture%20and%20energy%20optimization.

83. Training a Quantum Neural Network, accessed on April 23, 2025, http://papers.neurips.cc/paper/2363-training-a-quantum-neural-network.pdf

84. Training a Quantum Neural Network - NIPS papers, accessed on April 23, 2025, https://papers.nips.cc/paper/2363-training-a-quantum-neural-network

85. Learning to learn with quantum neural networks | PennyLane Demos, accessed on April 23, 2025, https://pennylane.ai/qml/demos/learning2learn/

©Copyright of Ian Dinmore ACIRO

Wednesday, 23 April 2025
