`docs/source/tutorials/tfim_vqe.ipynb` (+1 −1)

```diff
@@ -30,7 +30,7 @@
 "source": [
 "## Background\n",
 "\n",
-"Baiscally, we train a parameterized quantum circuit with repetions of $e^{i\\theta} ZZ$ and $e^{i\\theta X}$ layers as $U(\\rm{\\theta})$. And the objective to be minimized is this task is $\\mathcal{L}(\\rm{\\theta})=\\langle 0^n\\vert U(\\theta)^\\dagger H U(\\theta)\\vert 0^n\\rangle$. The Hamiltonian is from TFIM as $H = \\sum_{i} Z_iZ_{i+1} -\\sum_i X_i$."
+"Basically, we train a parameterized quantum circuit with repetitions of $e^{i\\theta} ZZ$ and $e^{i\\theta X}$ layers as $U(\\rm{\\theta})$. And the objective to be minimized in this task is $\\mathcal{L}(\\rm{\\theta})=\\langle 0^n\\vert U(\\theta)^\\dagger H U(\\theta)\\vert 0^n\\rangle$. The Hamiltonian is from TFIM as $H = \\sum_{i} Z_iZ_{i+1} -\\sum_i X_i$."
```
`docs/source/tutorials/tfim_vqe_cn.ipynb` (+25 −27)
```diff
@@ -11,26 +11,26 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Overview\n",
+"## 概述\n",
 "\n",
-"The main aim of this tutorial is not about the physics perspective of VQE, instead we demonstrate\n",
-"the main ingredients of tensorcircuit by this simple VQE toy model. "
+"本教程的主要目的不是关于 VQE 的物理观点,而是我们通过演示\n",
+"这个简单的 VQE 玩具模型来了解张量电路的主要成分。"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Background\n",
+"## 背景\n",
 "\n",
-"Baiscally, we train a parameterized quantum circuit with repetions of $e^{i\\theta} ZZ$ and $e^{i\\theta X}$ layers as $U(\\rm{\\theta})$. And the objective to be minimized is this task is $\\mathcal{L}(\\rm{\\theta})=\\langle 0^n\\vert U(\\theta)^\\dagger H U(\\theta)\\vert 0^n\\rangle$. The Hamiltonian is from TFIM as $H = \\sum_{i} Z_iZ_{i+1} -\\sum_i X_i$."
```
```diff
-"To train the parameterized circuit, we should utilize the gradient information $\\frac{\\partial \\mathcal{L}}{\\partial \\rm{\\theta}}$ with gradient descent.\n",
-"We also use ``jit`` to wrap the value and grad function for a substantial speed up. Note how (1, 2) args of ``vqe_tfim`` is labelled as static since they are just integers for qubit number and layer number instead of tensors."
+"我们还使用 ``jit`` 来包装 value 和 grad 函数以显着加快速度。 注意 ``vqe_tfim`` 的 (1, 2) args 是如何被标记为静态的,因为它们只是量子比特数和层数的整数,而不是张量。"
 ]
 },
 {
```
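The gradient-descent training loop this hunk refers to can be illustrated with a toy single-qubit analogue. For $\vert\psi(\theta)\rangle = R_X(\theta)\vert 0\rangle$ the energy $\langle\psi\vert Z\vert\psi\rangle = \cos\theta$, so the gradient is $-\sin\theta$ in closed form. This is a hand-derived stand-in: in the tutorial, value and grad of `vqe_tfim` come from automatic differentiation and are additionally jit-compiled with the integer qubit/layer arguments marked static:

```python
# Toy 1-parameter analogue of the tutorial's gradient-descent loop.
# E(theta) = <psi(theta)|Z|psi(theta)> = cos(theta); grad is -sin(theta).
# (Hand-derived here; the real notebook uses autodiff + jit instead.)
import math

def energy(theta):
    return math.cos(theta)

def grad(theta):
    return -math.sin(theta)

theta, lr = 0.5, 0.2
for _ in range(300):            # plain gradient descent
    theta -= lr * grad(theta)

print(round(energy(theta), 6))  # -1.0: converged to the global minimum
```

Marking plain Python integers (qubit count, layer count) as static arguments matters because jit tracing specializes the compiled function per static value instead of treating them as tensors.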
```diff
@@ -252,9 +252,9 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### batched VQE example\n",
+"### 批处理 VQE 示例\n",
 "\n",
-"We can even run a batched version of VQE optimization, namely, we simutaneously optimize parameterized circuit for different random initializations, so that we can try best to avoid local minimum be locate the best of the converged energies."
```
```diff
-"We can change the backends at runtime without even changing one line of the code!\n",
-"\n",
-"However, in normal user cases, we strongly recommend the users stick to one backend in one jupyter or python scripts.\n",
-"One can enjoy the facility provided by other backends by changing the ``set_backend`` line and running the same script again. This approach is much safer than using multiple backends in the same file unless you know the lower level details of tensorcircuit enough."
```
```diff
-"The higher level API under the namespace of ``tensorcircuit`` provides a unified framework to do linear algebra and automatic differentiation which is backend agnostic.\n",
+"### 更低层的 API\n",
 "\n",
-"One may also use the related APIs (ops, AD related, jit related) directly provided by tensorflow or jax, as long as one is ok to stick with one fixed backend. See tensorflow backend example below.\n"
+"`TensorCircuit` 命名空间下的更高级别 API 提供了一个统一的框架来进行线性代数和自动微分,这与后端无关。\n",
```