Commit 5549d7c

committed
Update week13.do.txt
1 parent 61f8ead commit 5549d7c

File tree

1 file changed

+366
-3
lines changed


doc/src/week13/week13.do.txt

Lines changed: 366 additions & 3 deletions
@@ -1,4 +1,4 @@
-TITLE: Quantum Computing, Quantum Machine Learning and Quantum Information Theories
TITLE: Quantum Computing and Quantum Machine Learning
AUTHOR: Morten Hjorth-Jensen {copyright, 1999-present|CC BY-NC} at Department of Physics, University of Oslo, Norway
DATE: April 23

@@ -7,10 +7,373 @@ DATE: April 23

===== Plans for the week of April 21-25 =====

!bblock
-o TBA
-o "Video of lecture TBA":"https://youtu.be/"
o Quantum Machine Learning (QML)
o Introduction to QML
o Support Vector Machines (SVM, classical machine learning approach)
o Quantum Support Vector Machine Learning (QSVM)
#o "Video of lecture TBA":"https://youtu.be/"
# o "Whiteboard notes":"https://github.com/CompPhysics/QuantumComputingMachineLearning/blob/gh-pages/doc/HandWrittenNotes/2024/NotesApril17.pdf"
!eblock

!split
===== What is Machine Learning? =====

Machine Learning (ML) is the study of algorithms whose performance improves with experience, that is, with data.

!bblock Types of Machine Learning:
o \textbf{Supervised Learning:} Labeled data for classification or regression.
o \textbf{Unsupervised Learning:} No labels; discover hidden patterns.
o \textbf{Reinforcement Learning:} Learning through interaction with an environment.
!eblock

!bblock
\textbf{ML Workflow:}
!bt
\[
\text{Data} \rightarrow \text{Model Training} \rightarrow \text{Prediction}
\]
!et
!eblock
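
As a minimal illustration of this workflow, the following sketch (assuming the scikit-learn library; the synthetic data set and the choice of model are ours, for illustration only) runs through the three steps:

!bc pycod
# Sketch of the Data -> Model Training -> Prediction workflow with scikit-learn
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

# Data: a synthetic, labeled binary classification set
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Model training (supervised learning)
model = LogisticRegression().fit(X_train, y_train)

# Prediction on unseen data
print("Test accuracy:", model.score(X_test, y_test))
!ec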

!split
===== What is Quantum Machine Learning? =====

\textbf{Quantum Machine Learning (QML)} integrates quantum computing with machine learning algorithms to exploit quantum advantages.

!bblock Motivation:
o High-dimensional Hilbert spaces for better feature representation.
o Quantum parallelism for faster computation.
o Quantum entanglement for richer data encoding.
!eblock
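
To make the first point concrete, here is a small sketch (plain NumPy; the product-state angle encoding is our illustrative choice of feature map, not fixed by the text) of embedding a classical data point in a Hilbert space whose dimension grows exponentially with the number of qubits:

!bc pycod
import numpy as np

def angle_encode(x):
    # Single-qubit angle encoding: R_y(x)|0> = cos(x/2)|0> + sin(x/2)|1>
    return np.array([np.cos(x / 2), np.sin(x / 2)])

# Encode a three-feature data point into a three-qubit product state
x = [0.3, 1.2, 2.1]
state = angle_encode(x[0])
for xi in x[1:]:
    state = np.kron(state, angle_encode(xi))

print(state.shape)  # (8,): the Hilbert-space dimension grows as 2^n
!ec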

!split
===== Quantum Speedups in ML =====

\textbf{Why Quantum?}
!bblock
o \textbf{Quantum Parallelism:} Process multiple states simultaneously.
o \textbf{Quantum Entanglement:} Correlated states carry richer information.
o \textbf{Quantum Interference:} Constructive and destructive interference enhance good solutions and suppress poor ones.
!eblock

\textbf{Example: Grover's algorithm}
!bt
\[
\text{Quantum search complexity: } O(\sqrt{N}) \text{ vs. } O(N).
\]
!et
\textbf{Advantage:}
* Speedups in high-dimensional optimization and linear algebra problems.
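
The square-root scaling can be checked directly with a small statevector simulation (a sketch in plain NumPy; the problem size and the marked item are arbitrary choices):

!bc pycod
import numpy as np

n = 4                   # number of qubits
N = 2 ** n              # size of the search space
marked = 11             # index of the marked item (arbitrary)

state = np.full(N, 1 / np.sqrt(N))                  # uniform superposition
iterations = int(np.round(np.pi / 4 * np.sqrt(N)))  # O(sqrt(N)) Grover steps

for _ in range(iterations):
    state[marked] *= -1               # oracle: phase flip on the marked item
    state = 2 * state.mean() - state  # diffusion: inversion about the mean

print("Success probability:", state[marked] ** 2)  # close to 1
!ec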

!split
===== Challenges in Quantum Machine Learning =====

\textbf{1. Quantum Hardware Limitations:}
* Noisy Intermediate-Scale Quantum (NISQ) devices.
* Decoherence and limited qubit coherence times.

\textbf{2. Data Encoding:}
* Efficient embedding of classical data into quantum states.

\textbf{3. Scalability:}
* Scaling circuits to large datasets is difficult.

!split
===== 1. Quantum Support Vector Machines (QSVM) =====

\textbf{Quantum Kernel Estimation:}
* Maps classical data to a quantum Hilbert space.
* A quantum kernel measures similarity in the high-dimensional space.

\textbf{Quantum Kernel:}
!bt
\[
K(x, x') = |\langle \psi(x) | \psi(x')\rangle|^2 .
\]
!et

\textbf{Advantage:}
* Potentially exponential speedup over classical SVMs.
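
A small sketch of such a kernel evaluation (plain NumPy; the product-state angle encoding is again our illustrative choice of feature map):

!bc pycod
import numpy as np

def feature_map(x):
    # Encode each feature on its own qubit via angle encoding (illustrative choice)
    state = np.array([1.0])
    for xi in x:
        state = np.kron(state, [np.cos(xi / 2), np.sin(xi / 2)])
    return state

def quantum_kernel(x, xp):
    # K(x, x') = |<psi(x)|psi(x')>|^2
    return abs(feature_map(x) @ feature_map(xp)) ** 2

print(quantum_kernel([0.1, 0.7], [0.4, 1.0]))
!ec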

!split
===== 2. Quantum Neural Networks (QNNs) =====

\textbf{Quantum Neural Networks} replace classical neurons with parameterized quantum circuits.

\textbf{Key Concepts:}
* Quantum gates play the role of activation functions.
* Variational Quantum Circuits (VQCs) are trained by classical optimization of the gate parameters.

\textbf{Parameterized quantum circuit} (schematic layer structure):
!bt
\[
U(\theta) = \prod_i R_y(\theta_i)\, \mathrm{CNOT}\, R_x(\theta_i).
\]
!et

\textbf{Advantage:}
* Quantum gradients, for example via the parameter-shift rule sketched below, enable exploration of non-convex landscapes.
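
The parameter-shift rule can be demonstrated on a one-qubit circuit (a NumPy sketch; the circuit $R_y(\theta)|0\rangle$ and the observable $Z$ are our minimal choices):

!bc pycod
import numpy as np

def expectation(theta):
    # <Z> after R_y(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>, i.e. cos(theta)
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return c ** 2 - s ** 2

def parameter_shift_gradient(theta):
    # Exact gradient for rotation gates: [f(theta+pi/2) - f(theta-pi/2)] / 2
    return 0.5 * (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2))

theta = 0.7
print(parameter_shift_gradient(theta), -np.sin(theta))  # the two values agree
!ec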

!split
===== 3. Quantum Boltzmann Machines (QBMs) =====

\textbf{Quantum Boltzmann Machines} use quantum mechanics to sample from a probability distribution.

* Quantum tunneling aids in escaping local minima.
* Quantum annealing can be used for optimization problems.

\textbf{Model Hamiltonian} (shown here in its diagonal, classical limit; the transverse-field term is added in the detailed discussion below):
!bt
\[
H = -\sum_i b_i \sigma_i^z - \sum_{ij} w_{ij} \sigma_i^z \sigma_j^z .
\]
!et

\textbf{Advantage:}
* Efficient sampling from complex probability distributions.
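
In the diagonal (classical) limit above, the resulting Boltzmann distribution can be enumerated exactly for a toy system (a NumPy sketch; the biases, coupling and temperature are hypothetical values):

!bc pycod
import numpy as np
from itertools import product

# H = -sum_i b_i s_i - w s_1 s_2 for two classical spins s_i = +-1
b = np.array([0.5, -0.2])   # hypothetical biases
w = 0.3                     # hypothetical coupling
beta = 1.0                  # inverse temperature

configs = list(product([1, -1], repeat=2))
E = np.array([-(b[0] * s1 + b[1] * s2) - w * s1 * s2 for s1, s2 in configs])
p = np.exp(-beta * E)
p /= p.sum()                # normalize by the partition function Z
for c, pc in zip(configs, p):
    print(c, f"{pc:.3f}")
!ec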

!split
===== Future Perspectives in QML =====

\textbf{1. Fault-Tolerant Quantum Computing:}
* Overcoming noise for stable quantum circuits.

\textbf{2. Hybrid Quantum-Classical Models:}
* Combining quantum circuits with classical neural networks.

\textbf{3. Quantum Internet:}
* Distributed quantum machine learning over quantum networks.

!split
===== Introduction to Support Vector Machines =====

Support Vector Machines (SVMs) are supervised learning algorithms used for classification tasks. The main goal of an SVM is to find the optimal separating hyperplane (in a possibly high-dimensional feature space) that maximizes the margin between the classes.

!split
===== SVMs: mathematical formulation =====

For a dataset $(\mathbf{x}_i, y_i)$ with $\mathbf{x}_i \in \mathbb{R}^n$ and $y_i \in \{-1, 1\}$, the decision boundary is defined by
!bt
\[
f(\mathbf{x}) = \mathbf{w} \cdot \mathbf{x} + b = 0 .
\]
!et

The optimization problem is
!bt
\[
\min_{\mathbf{w}, b} \frac{1}{2} \Vert\mathbf{w}\Vert^2 ,
\]
!et
subject to
!bt
\[
y_i (\mathbf{w} \cdot \mathbf{x}_i + b) \geq 1, \quad \forall i .
\]
!et
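
A minimal sketch of this maximum-margin problem with scikit-learn (the separable toy data and the large value of $C$, which mimics a hard margin, are our choices for illustration):

!bc pycod
import numpy as np
from sklearn.svm import SVC

# Two linearly separable point clouds in the plane
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # large C approximates a hard margin
w, b = clf.coef_[0], clf.intercept_[0]
print("w =", w, "b =", b)
print("margin width =", 2 / np.linalg.norm(w))  # the margin equals 2/||w||
!ec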

!split
===== Quantum Support Vector Machines =====

\textbf{Motivation.}
QSVMs leverage quantum subroutines such as quantum phase estimation and quantum matrix inversion to enhance SVM algorithms, in particular for handling large datasets and complex kernels efficiently.

\textbf{Quantum kernel estimation.}
In classical SVMs, kernels enable non-linear separation of the data. Quantum computers can speed up the evaluation of complex kernels by efficiently computing inner products in an exponentially large Hilbert space.

\textbf{The quantum kernel trick.}
In analogy with the classical kernel trick, a QSVM uses a \textit{quantum-enhanced kernel}
!bt
\[
K(\mathbf{x}, \mathbf{y}) = |\langle \phi(\mathbf{x}) | \phi(\mathbf{y})\rangle|^2 ,
\]
!et
where $|\phi(\mathbf{x})\rangle$ is the quantum state encoding of the classical data point $\mathbf{x}$.
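
Putting the pieces together, such a kernel can be fed to a classical SVM as a precomputed Gram matrix (a sketch combining NumPy and scikit-learn; the feature map, the toy data and the labels are our illustrative choices):

!bc pycod
import numpy as np
from sklearn.svm import SVC

def feature_map(x):
    # Angle-encode each feature on its own qubit (illustrative choice)
    state = np.array([1.0])
    for xi in x:
        state = np.kron(state, [np.cos(xi / 2), np.sin(xi / 2)])
    return state

def gram_matrix(A, B):
    # K_ij = |<phi(a_i)|phi(b_j)>|^2, evaluated classically for this small example
    return np.array([[abs(feature_map(a) @ feature_map(b)) ** 2 for b in B]
                     for a in A])

rng = np.random.default_rng(0)
X = rng.uniform(0, np.pi, size=(40, 2))
y = (X[:, 0] > X[:, 1]).astype(int)   # toy labels

clf = SVC(kernel="precomputed").fit(gram_matrix(X, X), y)
print("Training accuracy:", clf.score(gram_matrix(X, X), y))
!ec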

!split
===== Quantum Advantage in SVM =====

\textbf{Quantum speedup.}
Quantum algorithms such as the HHL algorithm (Harrow, Hassidim, and Lloyd) for solving linear systems of equations can, under suitable assumptions on sparsity and conditioning, provide large speedups by exploiting quantum parallelism and entanglement.

\textbf{Practical considerations.}
Actual implementations face challenges from limited qubit coherence times, error rates, and noise, and require classical preprocessing strategies in order to exploit quantum-enhanced procedures efficiently.

!split
===== Applications =====

Quantum Support Vector Machines have potential applications in finance (fraud detection), in healthcare (diagnosis from large biological datasets), and broadly in any area requiring rapid classification and pattern recognition beyond classical limits.

!split
===== Conclusion and future work =====

QSVMs are a promising direction in quantum machine learning. The road forward involves demonstrating tangible quantum advantages on existing quantum hardware, advancing error-correction techniques, and developing larger qubit systems.

!split
===== Advanced topics: Quantum Boltzmann Machines =====

Quantum Boltzmann Machines (QBMs) are a quantum generalization of classical Boltzmann machines. They leverage quantum effects such as superposition and entanglement to model complex probability distributions. QBMs are well suited for quantum machine learning tasks, in particular for generative modeling.

\textbf{Motivation.}
Classical Boltzmann machines suffer from high-dimensional sampling complexity. Quantum mechanics offers an exponentially large state space and quantum tunneling effects that can alleviate these issues.

\textbf{Key Concepts:}
* Quantum states as probability distributions
* Quantum tunneling for escaping local minima
* Exponential state space in quantum systems

!split
===== Mathematical framework: the QBM Hamiltonian =====

The quantum analog of the classical energy-based model is defined by the Hamiltonian
!bt
\begin{equation}
H = H_Z + H_X,
\end{equation}
!et
where
!bt
\begin{align}
H_Z &= -\sum_{i} b_i Z_i - \sum_{i<j} w_{ij} Z_i Z_j, \\
H_X &= -\sum_i \Gamma_i X_i.
\end{align}
!et

Here
* $Z_i$ and $X_i$ are Pauli operators acting on the $i$-th qubit,
* $b_i$ are the bias terms,
* $w_{ij}$ are the interaction strengths between qubits $i$ and $j$,
* $\Gamma_i$ is the transverse-field strength.
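
For a two-qubit system the full Hamiltonian can be built explicitly (a NumPy sketch; the parameter values are hypothetical):

!bc pycod
import numpy as np
from functools import reduce

I = np.eye(2)
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def op(single, site, n=2):
    # Embed a single-qubit operator at position `site` in an n-qubit system
    return reduce(np.kron, [single if k == site else I for k in range(n)])

b, w, Gamma = [0.5, -0.2], 0.3, 0.7      # hypothetical parameters
H_Z = -sum(b[i] * op(Z, i) for i in range(2)) - w * op(Z, 0) @ op(Z, 1)
H_X = -Gamma * (op(X, 0) + op(X, 1))
H = H_Z + H_X

print(np.linalg.eigvalsh(H))  # spectrum of the transverse-field model
!ec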

!split
===== Density matrix and Boltzmann distribution =====

The quantum Boltzmann distribution is defined by the density matrix
!bt
\begin{equation}
\rho = \frac{e^{-\beta H}}{Z},
\end{equation}
!et
where
* $\beta = 1/(k_B T)$ is the inverse temperature,
* $Z = \mathrm{Tr}\,(e^{-\beta H})$ is the partition function.

In the classical case this reduces to the Gibbs distribution.
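
The density matrix can be computed directly for small systems (a sketch using NumPy and SciPy; a small random Hermitian matrix stands in for $H$ so that the snippet is self-contained):

!bc pycod
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
H = (A + A.T) / 2            # a Hermitian two-qubit Hamiltonian (stand-in)
beta = 1.0

rho = expm(-beta * H)
rho /= np.trace(rho)         # divide by the partition function Z = Tr exp(-beta H)

print(np.trace(rho))                       # 1.0: a normalized state
print(np.linalg.eigvalsh(rho).min() >= 0)  # True: rho is positive semi-definite
!ec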

!split
===== Training Quantum Boltzmann Machines =====

\textbf{Objective function.}
The goal of training a QBM is to minimize the Kullback-Leibler (KL) divergence between the data distribution $p_{\text{data}}$ and the model distribution $p_{\theta}$,
!bt
\begin{equation}
\mathcal{L}(\theta) = \mathrm{KL}(p_{\text{data}} \Vert p_{\theta}) = \sum_x p_{\text{data}}(x) \log \frac{p_{\text{data}}(x)}{p_{\theta}(x)}.
\end{equation}
!et

\textbf{Gradient-based optimization.}
The gradient of the loss function is
!bt
\begin{equation}
\nabla_\theta \mathcal{L}(\theta) = \mathbb{E}_{p_{\text{data}}}[\nabla_\theta E(x)] - \mathbb{E}_{p_{\theta}}[\nabla_\theta E(x)],
\end{equation}
!et
where $E(x)$ is the energy function derived from the Hamiltonian.
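
The two-term structure of the gradient (a data expectation minus a model expectation) can be checked by exact enumeration in the classical limit (a NumPy sketch; the single-bias model and the empirical distribution are hypothetical):

!bc pycod
import numpy as np

# Energy E(x) = -b*x for a single spin x in {-1, +1}
def model_probs(b, beta=1.0):
    E = np.array([b, -b])          # E(-1) = b, E(+1) = -b
    p = np.exp(-beta * E)
    return p / p.sum()

x = np.array([-1, 1])
p_data = np.array([0.2, 0.8])      # hypothetical empirical distribution
b = 0.1

dE_db = -x                         # dE/db for each configuration
grad = p_data @ dE_db - model_probs(b) @ dE_db  # data term minus model term
print(grad)
!ec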

!split
===== Quantum sampling techniques =====

\textbf{Quantum Monte Carlo methods.}
Quantum Monte Carlo (QMC) simulates quantum systems by sampling from the quantum density matrix using classical resources. It is, however, limited by the \textbf{sign problem}.

\textbf{Quantum annealing.}
Quantum annealers use adiabatic evolution to reach low-energy states efficiently, following the schedule
!bt
\begin{equation}
H(t) = (1 - t/T) H_B + (t/T) H_P,
\end{equation}
!et
where
* $H_B$ is the mixing Hamiltonian,
* $H_P$ is the problem Hamiltonian.
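
The interpolation can be traced numerically for single-qubit stand-ins (a NumPy sketch; $H_B = -X$ and $H_P = -Z$ are minimal illustrative choices):

!bc pycod
import numpy as np

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
H_B, H_P, T = -X, -Z, 1.0

# Track the spectral gap along the schedule H(t) = (1 - t/T) H_B + (t/T) H_P
for t in np.linspace(0.0, T, 5):
    H_t = (1 - t / T) * H_B + (t / T) * H_P
    E = np.linalg.eigvalsh(H_t)
    print(f"t/T = {t/T:.2f}  gap = {E[1] - E[0]:.3f}")
!ec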

!split
===== Advantages and challenges =====

\textbf{Advantages:}
* Quantum parallelism
* Efficient sampling in complex systems
* Potential for exponential speedups

\textbf{Challenges:}
* Noisy quantum hardware
* High cost of quantum simulation
* Quantum decoherence

!split
===== Applications of Quantum Boltzmann Machines =====

\textbf{Generative modeling.}
Quantum Boltzmann Machines can generate complex probability distributions, with applications in
* image and text generation,
* quantum chemistry,
* financial modeling.

\textbf{Optimization problems.}
QBMs are suitable for optimization problems where classical approaches get trapped in local minima.

!split
===== Conclusion =====

Quantum Boltzmann Machines offer a promising path for leveraging quantum resources in machine learning. While hardware limitations currently restrict scalability, ongoing research in quantum algorithms and quantum hardware is likely to overcome these obstacles.
