Commit 4b4315a ("Update week13.do.txt", 1 parent 5549d7c)

1 file changed: doc/src/week13/week13.do.txt (84 additions, 265 deletions)
===== Challenges in Quantum Machine Learning =====

!bblock Quantum Hardware Limitations:
o Noisy Intermediate-Scale Quantum (NISQ) devices.
o Decoherence and limited qubit coherence times.
!eblock

!bblock Data Encoding:
o Efficient embedding of classical data into quantum states.
!eblock

!bblock Scalability:
o Difficult to scale circuits to large datasets.
!eblock

!split
===== Classical Support Vector Machines (SVMs) =====

Support Vector Machines (SVMs) are supervised learning algorithms used
for both classification and regression tasks. The main goal of an SVM
is to find the optimal separating hyperplane (in a high-dimensional
space) that maximizes the margin between classes.

!split
===== First mathematical formulation of SVMs (formal details below) =====

For a dataset \((\mathbf{x}_i, y_i)\) where \(\mathbf{x}_i \in
\mathbb{R}^n\) and \(y_i \in \{-1, 1\}\), the decision boundary is
defined as
!bt
\[
f(\mathbf{x}) = \mathbf{w} \cdot \mathbf{x} + b = 0.
\]
!et

The goal is to optimize
!bt
\[
\min_{\mathbf{w}, b} \frac{1}{2} ||\mathbf{w}||^2,
\]
!et
subject to
!bt
\[
y_i (\mathbf{w} \cdot \mathbf{x}_i + b) \geq 1, \quad \forall i.
\]
!et
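This constrained problem can be checked numerically. A minimal sketch using scikit-learn (assumed available, and not part of the lecture notes; the toy data and the large value of C are illustration choices that approximate the hard-margin limit):

```python
# Hard-margin illustration: fit a linear SVM on separable toy data and
# verify the constraints y_i (w . x_i + b) >= 1 on the training set.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.5, 0.5], [3.0, 3.0], [3.5, 2.5]])
y = np.array([-1, -1, 1, 1])

# A very large C approximates the hard-margin objective min (1/2)||w||^2.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
margins = y * (X @ w + b)
print(margins.min())  # close to 1 for the support vectors
```

The smallest margin is attained by the support vectors, which sit exactly on the boundary of the slab \(y_i f(\mathbf{x}_i) = 1\).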

!split
===== Kernels and more =====

In classical SVMs, kernels handle data that are not linearly
separable. Quantum computers can speed up the evaluation of complex
kernels by efficiently computing an inner product in an exponentially
large Hilbert space.
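A small classical illustration of why kernels matter, assuming scikit-learn is available (the XOR-style data are synthetic): a linear SVM cannot separate the classes, while an RBF kernel, which works implicitly in a higher-dimensional feature space, fits them well.

```python
# Kernel trick on an XOR-like problem: compare a linear and an RBF SVM.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.where(X[:, 0] * X[:, 1] > 0, 1, -1)  # labels by quadrant sign

linear_acc = SVC(kernel="linear").fit(X, y).score(X, y)
rbf_acc = SVC(kernel="rbf").fit(X, y).score(X, y)
print(linear_acc, rbf_acc)  # the RBF kernel fits far better
```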


!split
===== Quantum Support Vector Machines (QSVMs) =====

!bblock Quantum Kernel Estimation:
o Maps classical data to a quantum Hilbert space.
o Quantum kernel measures similarity in high-dimensional space.
!eblock

!bblock Quantum-enhanced kernel:
!bt
\[
K(\bm{x}, \bm{x}') = |\langle \psi(\bm{x}) \vert \psi(\bm{x}')\rangle|^2
\]
!et
!eblock

Here, $\vert \psi(\bm{x})\rangle$ is the quantum state encoding of
the classical data $\bm{x}$.

Advantage: Potentially exponential speedup over classical SVMs.
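For a few qubits this kernel can be simulated classically. A minimal numpy sketch, where the particular angle-encoding feature map is an illustrative assumption (not a prescription from the lecture):

```python
# Statevector simulation of K(x, x') = |<psi(x)|psi(x')>|^2.
import numpy as np

def feature_state(x):
    # Angle encoding (illustrative choice of |psi(x)>): each feature x_j
    # becomes one qubit in the state cos(x_j/2)|0> + sin(x_j/2)|1>.
    state = np.array([1.0])
    for xj in x:
        state = np.kron(state, np.array([np.cos(xj / 2), np.sin(xj / 2)]))
    return state

def quantum_kernel(x, xp):
    # Squared overlap of the two encoded (normalized) statevectors.
    return float(abs(feature_state(x) @ feature_state(xp)) ** 2)

x1 = np.array([0.3, 1.2])
x2 = np.array([1.0, -0.5])
print(quantum_kernel(x1, x1))  # self-overlap of a normalized state: 1 (up to rounding)
print(quantum_kernel(x1, x2))
```

The resulting Gram matrix can be passed to a classical SVM (e.g. `SVC(kernel="precomputed")`); the quantum advantage would come from feature maps whose overlaps are hard to evaluate classically, unlike this small product-state example.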

!split
===== Quantum Neural Networks (QNNs) =====

Quantum Neural Networks replace classical neurons with parameterized quantum circuits.

!bblock Key Concepts:
o Quantum gates as activation functions.
o Variational Quantum Circuits (VQCs) for optimization.
!eblock

!bblock Parameterized quantum circuit:
!bt
\[
U(\theta) = \prod_i R_y(\theta_i) \cdot \mathrm{CNOT} \cdot R_x(\theta_i)
\]
!et
!eblock

Advantage: Quantum gradients enable exploration of non-convex landscapes.
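A minimal two-qubit statevector sketch of one such parameterized layer (numpy only; the gate ordering, the initial state, and the choice of a Pauli-Z readout are illustrative assumptions):

```python
# One variational layer in the spirit of U(theta): single-qubit Rx
# rotations, an entangling CNOT, then Ry rotations, applied to |00>.
import numpy as np

I2 = np.eye(2, dtype=complex)
PX = np.array([[0, 1], [1, 0]], dtype=complex)
PY = np.array([[0, -1j], [1j, 0]])
PZ = np.array([[1, 0], [0, -1]], dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def Rx(t):
    return np.cos(t / 2) * I2 - 1j * np.sin(t / 2) * PX

def Ry(t):
    return np.cos(t / 2) * I2 - 1j * np.sin(t / 2) * PY

def qnn_output(theta):
    # The "prediction" is the expectation value of Z on the first qubit.
    psi = np.zeros(4, dtype=complex)
    psi[0] = 1.0
    psi = np.kron(Rx(theta[0]), Rx(theta[1])) @ psi
    psi = CNOT @ psi
    psi = np.kron(Ry(theta[2]), Ry(theta[3])) @ psi
    return float(np.real(psi.conj() @ np.kron(PZ, I2) @ psi))

print(qnn_output(np.zeros(4)))  # identity circuit on |00> gives +1
```

In a variational training loop, `theta` would be updated by a classical optimizer using gradients of `qnn_output` (e.g. via the parameter-shift rule).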

!split
===== Quantum Boltzmann Machines (QBMs) =====

Quantum Boltzmann Machines leverage quantum mechanics to sample from a probability distribution.

!bblock
o Quantum tunneling aids in escaping local minima.
o Quantum annealing for optimization problems.
!eblock

!bblock Quantum Hamiltonian:
!bt
\[
H = -\sum_i b_i \sigma_i^z - \sum_{ij} w_{ij} \sigma_i^z \sigma_j^z
\]
!et
!eblock

Advantage: Efficient sampling from complex probability distributions.
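In the diagonal (classical) limit of the Hamiltonian above, the $\sigma^z$ operators reduce to spin values $s_i = \pm 1$, and the Boltzmann distribution $p(s) \propto e^{-\beta E(s)}$ can be enumerated exactly for a few spins. A minimal sketch (the biases, couplings, and temperature below are made-up illustration values):

```python
# Exact Boltzmann distribution for a 3-spin classical Ising model:
# E(s) = -sum_i b_i s_i - sum_{i<j} w_ij s_i s_j, with s_i = +/-1.
import numpy as np
from itertools import product

b = np.array([0.2, -0.1, 0.3])          # biases (illustration values)
w = {(0, 1): 0.5, (1, 2): -0.4, (0, 2): 0.1}  # couplings
beta = 1.0                               # inverse temperature

def energy(s):
    return -float(b @ s) - sum(wij * s[i] * s[j] for (i, j), wij in w.items())

states = [np.array(s) for s in product([-1, 1], repeat=3)]
weights = np.exp([-beta * energy(s) for s in states])
Z = weights.sum()        # partition function
probs = weights / Z      # exact distribution over the 2^3 configurations

# Drawing samples from this distribution is the step a QBM would
# accelerate; here we sample it directly since it is tiny.
rng = np.random.default_rng(1)
samples = rng.choice(len(states), size=1000, p=probs)
print(probs.sum())  # sums to 1 (up to rounding)
```

Brute-force enumeration scales as $2^n$, which is exactly why sampling from large Boltzmann distributions is hard classically and attractive for quantum hardware.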

!split
===== Back to math of SVMs =====
