"Tensor network theory"의 두 판 사이의 차이


Notes

Wikidata

Corpus

  1. Tensor networks are a recently developed formulation for quantum systems which enables major advances in both the conceptual understanding and the simulation of these systems.[1]
  2. The first successful tensor network and associated algorithm, the density matrix renormalization group (DMRG), was invented by group leader Steven White in 1992.[1]
  3. The tensor network group aims to push the development of tensor networks and DMRG to the next level.[1]
  4. However, tensor network techniques, along with the canonical transformation approach developed by White and Chan for quantum chemistry, offer a new approach to improving the models.[1]
  5. Pushing today’s best tensor network algorithms to work in an additional dimension would require more computing power than currently exists.[2]
  6. The approach they developed is easier to learn and use than the leading existing 2D algorithms, which means it could be adopted more easily by researchers not already acquainted with tensor networks.[2]
  7. This is not a story of failure, though, says Guifre Vidal, a Perimeter researcher and pioneer of tensor network research.[2]
  8. A tensor network is a mathematical tool used to study the ways that many small objects in a system, such as its particles, combine and behave en masse.[2]
  9. This is the stage of complexity where Dr Orús joined, and at the heart of his solution for creating simulations of complex quantum phenomena is the concept of tensor networks.[3]
  10. ‘The topic of tensor networks has exploded a lot in recent years,’ he says.[3]
  11. To visualise the structure of tensor networks, Dr Orús and his colleagues use the analogy of DNA, and how its structure determines the overall makeup of the human body.[3]
  12. Essentially, constructing the simplest tensor network first involves arranging tensors in an orderly line.[3]
  13. Tensor networks in physics can be traced back to a 1971 paper by Penrose (1).[4]
  14. We adapted these tools and discovered efficient tensor network descriptions of finite Abelian lattice gauge theories (14).[4]
  15. In the braided tensor network model, the braiding sequences to represent a Feynman gate are painfully complicated—resembling perhaps a musical score—and hence salient intuitive features are lacking.[4]
  16. The known tensor network frameworks differ strikingly from the topological underpinnings in ref.[4]
  17. Here, Pellionisz described the analysis of the sensory input into the vestibular canals as the covariant vector component of tensor network theory.[5]
  18. It is shown that the SATNS parametrization solves the convergence issues found for previous correlator-based tensor network states.[6]
  19. Tensor networks have come to provide toy models to understand these bulk-boundary correspondences, shedding light on connections between geometry and entanglement.[7]
  20. We introduce a versatile and efficient framework for studying tensor networks, extending previous tools for Gaussian matchgate tensors in 1 + 1 dimensions.[7]
  21. Within our framework, we also produce translation-invariant critical states by an efficiently contractible tensor network with the geometry of the multiscale entanglement renormalization ansatz.[7]
  22. Furthermore, we establish a link between holographic quantum error-correcting codes and tensor networks.[7]
  23. The construction of the optimal tensor network is thus a far more complex task than it is in the case of the MPS based approaches.[8]
  24. The coefficients are obtained by contracting the virtual indices of the tensors according to the scheme of a tree tensor network (see figure); for z = 2, the one-dimensional MPS ansatz used in DMRG is recovered.[8]
  25. In our algorithmic approach to optimize the tree tensor network, we use tools similar to those used in refs 35, 37, and 38, and optimize the network site-by-site as in the DMRG.[8]
  26. Therefore, they can describe a tree tensor network, i.e., they emerge from contractions of a set of tensors, one tensor with z + 1 indices at each vertex of the network, according to Figure 3.[8]
  27. We will see that this problem can be addressed in a scalable way making use of tensor network based parameterizations for the governing equations.[9]
  28. On the other hand, we will investigate the expressive power of tensor networks in probabilistic modelling.[9]
  29. Unlike Monte Carlo techniques which have difficulty in calculating such quantities, we will demonstrate that tensor networks provide a natural framework for tackling these problems.[9]
  30. One finds that the square lattice Ising model is already an example of a two-dimensional tensor network (TN), which is formed by contracting 4-leg tensors.[9]
  31. Tensor networks take a central role in quantum physics as they can provide an efficient approximation to specific classes of quantum states.[10]
  32. They expect the work to stimulate further investigations of tensor network models to capture bulk-boundary correspondences.[10]
  33. They also included previous tensor network approaches such as the "MERA" model within the present work, to highlight connections between them.[10]
  34. The team restricted the study to tensor networks that are nonunitary and real, resembling a Euclidean evolution from the bulk to boundary.[10]
  35. We are recruiting postdoctoral fellows with interests in tensor networks.[11]
  36. A tensor network is a collection of tensors with indices connected according to a network pattern.[11]
  37. We then lifted the resulting MERA tensor network representation using the symmetric lifting tensor defined in Fig.[12]
  38. A promising solution to this problem is Tensor Network Theory (TNT), which takes the approach of breaking up the information of a quantum state into a network of tensors (multidimensional arrays).[13]
  39. All the core features of tensor networks have been implemented and optimised, and a suite of MPS algorithms for 1D is now complete.[13]
  40. Tensor network states constitute a powerful machinery of numerically solving such systems, as well as analytically characterizing their properties.[14]
  41. Notions of topological order or the classification of phases can be elegantly expressed in terms of such tensor networks.[14]
  42. The underlying so-called tensor network algorithms have been developed over the past two decades but it is only now that a unified framework for these algorithms is known.[15]
  43. Sukhwinder Singh, Guifre Vidal, Tensor network states and algorithms in the presence of a global SU(2) symmetry, Phys.
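
Items 12 and 36 above describe the simplest arrangement: tensors placed in an orderly line, with indices connected according to a network pattern, i.e. what the cited sources call a matrix product state (MPS; see also item 39). The sketch below is a minimal, self-contained illustration of that idea in NumPy; the chain length, bond dimension, and random tensors are assumptions chosen purely for illustration, not taken from any of the cited sources.

  # Minimal MPS sketch: a line of 3-index tensors contracted along their shared
  # (virtual) indices. All sizes here are illustrative assumptions.
  import numpy as np

  rng = np.random.default_rng(0)
  n_sites = 4   # length of the chain
  d = 2         # physical dimension per site (e.g. a spin-1/2)
  chi = 3       # virtual (bond) dimension

  # One tensor per site with shape (left bond, physical index, right bond);
  # the outer bonds have dimension 1 so the chain terminates.
  tensors = []
  for i in range(n_sites):
      dl = 1 if i == 0 else chi
      dr = 1 if i == n_sites - 1 else chi
      tensors.append(rng.standard_normal((dl, d, dr)))

  # Contracting every connected virtual index recovers the full state vector
  # with d**n_sites components (exponential in the chain length).
  psi = tensors[0]
  for t in tensors[1:]:
      psi = np.tensordot(psi, t, axes=([-1], [0]))
  psi = psi.reshape(d ** n_sites)

  # The squared norm <psi|psi> can instead be obtained by contracting the
  # network site by site, at a cost polynomial in n_sites; this is the kind of
  # contraction scheme that tensor network algorithms such as DMRG exploit.
  env = np.ones((1, 1))
  for t in tensors:
      env = np.einsum('ab,apc,bpd->cd', env, t, t)

  print(np.allclose(env[0, 0], psi @ psi))  # True

The same bookkeeping applies to the tree tensor networks of items 23-26 and the two-dimensional network of item 30: the tensors carry more indices and the virtual indices are contracted according to a tree or lattice pattern instead of a line.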

Sources

Metadata

Wikidata

  • ID : Q17157163 (https://www.wikidata.org/wiki/Q17157163)

Spacy pattern list

  • [{'LOWER': 'tensor'}, {'LOWER': 'network'}, {'LEMMA': 'theory'}]
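
The bracketed entry above is a spaCy token pattern: it matches the token "tensor" followed by "network" followed by any token whose lemma is "theory" (so "theory" and "theories" both match). A minimal sketch of how such a pattern could be registered with spaCy's rule-based Matcher is given below; the pipeline name en_core_web_sm and the sample sentence are assumptions for illustration, and the LEMMA condition requires a pipeline that assigns lemmas.

  # Minimal sketch: using the token pattern above with spaCy's Matcher.
  import spacy
  from spacy.matcher import Matcher

  nlp = spacy.load("en_core_web_sm")   # any English pipeline with a lemmatizer
  matcher = Matcher(nlp.vocab)

  # The pattern listed above.
  pattern = [{'LOWER': 'tensor'}, {'LOWER': 'network'}, {'LEMMA': 'theory'}]
  matcher.add("TENSOR_NETWORK_THEORY", [pattern])

  doc = nlp("Tensor network theories give efficient descriptions of quantum states.")
  for match_id, start, end in matcher(doc):
      print(doc[start:end].text)   # e.g. "Tensor network theories"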