사영작용소 (projection operator)

Latest revision as of 08:11, 28 December 2020

Notes

Corpus

  1. Thus, there exists a basis in which \(P\) (more precisely, the corresponding orthogonal projection matrix) has the form \(P = \begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix}\), where \(r\) is the rank of \(P\).[1]
  2. The first factor corresponds to the maximal invariant subspace on which \(P\) acts as an orthogonal projection (so that \(P\) itself is orthogonal if and only if there are no oblique blocks), and the remaining blocks correspond to the oblique components.[1]
  3. Applying the projection, we get \(P\mathbf{x} = (\mathbf{x}\cdot\mathbf{u})\,\mathbf{u}\) by the properties of the dot product of parallel and perpendicular vectors.[1]
  4. \( {\bf A} \left( {\bf A}^{\mathrm T} {\bf A} \right)^{-1} {\bf A}^{\mathrm T} \) represents the orthogonal projection of \( \mathbb{R}^m \) onto the range of \( {\bf A} \) (the span of its columns).[1]
  5. Now, a projection, I'm going to give you just a sense of it, and then we'll define it a little bit more precisely.[2]
  6. Either of those are how I think of the idea of a projection.[2]
  7. I think the shadow is part of the motivation for why it's even called a projection, right?[2]
  8. And one thing we can do is, when I created this projection-- let me actually draw another projection of another line or another vector just so you get the idea.[2]
  9. Typically, a vector projection is denoted in a bold font (e.g. \(\mathbf{a}_1\)), and the corresponding scalar projection with normal font (e.g. \(a_1\)).[3]
  10. The scalar projection a on b is a scalar which has a negative sign if 90 degrees < θ ≤ 180 degrees.[3]
  11. It coincides with the length ‖c‖ of the vector projection if the angle is smaller than 90°.[3]
  12. The orthogonal projection can be represented by a projection matrix.[3]
  13. \(x_W\) is called the orthogonal projection of \(x\) onto \(W\).[4]
  14. The following theorem gives a method for computing the orthogonal projection onto a column space.[4]
  15. Example (Orthogonal projection onto a line) Let \(L = \operatorname{Span}\{u\}\) be a line in \(\mathbb{R}^n\) and let \(x\) be a vector in \(\mathbb{R}^n\).[4]
  16. When projecting onto a line \(L = \operatorname{Span}\{u\}\), our formula for the projection can be derived very directly and simply.[4]
  17. For the technical drawing concept, see Orthographic projection .[5]
  18. Though abstract, this definition of "projection" formalizes and generalizes the idea of graphical projection.[5]
  19. Let \(W\) be a finite-dimensional vector space and \(P\) be a projection on \(W\).[5]
  20. The operator \(Q = 1 - P\) is also a projection, as the range and kernel of \(P\) become the kernel and range of \(Q\) and vice versa.[5]
  21. A projection matrix \(P\) is an \(n \times n\) square matrix that gives a vector space projection from \(\mathbb{R}^n\) to a subspace \(W\).[6]
  22. Any vector in \(W\) is fixed by the projection matrix: \(Px = x\) for any \(x\) in \(W\).[6]
  23. An element \(p\) of a \(C^*\)-algebra is called a projection if \(p = p^*\) and \(p = p^2\).[6]
  24. For example, the real function \(f\) defined by \(f = 0\) on \(X_1\) and \(f = 1\) on \(X_2\) is a projection in the \(C^*\)-algebra \(C(X)\), where \(X\) is assumed to be disconnected with two components \(X_1\) and \(X_2\).[6]
  25. This is why \(P\) is referred to as a projection operator.[7]
  26. In the case of a projection operator, this implies that there is a square matrix that, once post-multiplied by the coordinates of a vector, gives the coordinates of the projection of that vector onto the given subspace along its complement.[8]
  27. First of all, the claim holds because the projection operator preserves the first coordinate and annihilates the other two (when coordinates are expressed with respect to the chosen basis).[8]
  28. We start from the hypothesis that the given matrix is a projection matrix.[8]
  29. Therefore, it is the matrix of the projection operator that projects vectors of the space onto the given subspace along its complement.[8]
  30. When the answer is “no”, the quantity we compute while testing turns out to be very useful: it gives the orthogonal projection of that vector onto the span of our orthogonal set.[9]
  31. The first efficiently updates sample matrices to avoid computing new randomized projections.[10]
  32. Projection matrices essentially simplify the dimensionality of some space, by casting points onto a lower-dimensional plane.[11]
  33. It should be clear now that linear projection and linear regression are connected – but it is probably less clear why this holds.[11]
  34. From our discussion in Section 3.1, we know that the “best” vector is the orthogonal projection from the column space to the vector \(y\).[11]
  35. Because we know that the orthogonal projection of \(y\) onto the column space minimises the error between our prediction \(\hat{y}\) and the observed outcome vector \(y\).[11]
  36. I came across a nice equivalence between the standard index notation for vectors and the vector projection operator.[12]
  37. The theory of image restoration by projection onto convex sets can also be applied to the restoration of vector fields.[13]
  38. One of the properties, divergence freedom, is considered, and the theory and numerical implementation of its projection operator are presented.[13]
  39. This operator is called a projection operator because all \(x'\) are in the direction of \(x_1\), and the length of \(x'\) equals the component of \(x\) in the \(x_1\) direction, namely \((x_1, x)\).[14]
  40. We first consider orthogonal projection onto a line.[15]
  41. The orthogonal projection of \(\vec{v}\) onto the line spanned by a nonzero \(\vec{s}\) is the vector \(\left(\frac{\vec{v}\cdot\vec{s}}{\vec{s}\cdot\vec{s}}\right)\vec{s}\).[15]
  42. The picture above with the stick figure walking out on the line until \(\vec{v}\)'s tip is overhead is one way to think of the orthogonal projection of a vector onto a line.[15]
  43. Finally, another useful way to think of the orthogonal projection is to have the person stand not on the line, but on the vector that is to be projected to the line.[15]
  44. We go to the rest frame and try to find a projection operator in a covariant form.[16]
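Several of the notes above (4, 12, 25, 28) concern the orthogonal projection matrix \( {\bf A} \left( {\bf A}^{\mathrm T} {\bf A} \right)^{-1} {\bf A}^{\mathrm T} \) onto a column space and its defining properties, idempotency and symmetry. A minimal sketch in pure Python (no external libraries); the 3×2 matrix A below is a made-up example, not from any of the cited sources:

```python
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

def inv2(M):
    # inverse of a 2x2 matrix via the adjugate formula
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]        # 3x2, rank 2 (made-up example)
At = transpose(A)
P = matmul(matmul(A, inv2(matmul(At, A))), At)  # P = A (A^T A)^{-1} A^T, 3x3

# P is idempotent (P^2 = P) and symmetric, the two properties that
# characterize an orthogonal projection matrix.
P2 = matmul(P, P)
assert all(abs(P2[i][j] - P[i][j]) < 1e-12 for i in range(3) for j in range(3))
assert all(abs(P[i][j] - P[j][i]) < 1e-12 for i in range(3) for j in range(3))
```

As a side check, the trace of such a projection matrix equals the rank of A (here 2), matching note 1's statement that the projection looks like \(I_r \oplus 0\) in a suitable basis.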
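Notes 3, 15–16 and 40–43 all rely on the same formula for orthogonal projection onto a line: \(\operatorname{proj}_{\vec{s}}(\vec{v}) = \frac{\vec{v}\cdot\vec{s}}{\vec{s}\cdot\vec{s}}\,\vec{s}\). A minimal sketch in pure Python; the concrete vectors are made-up examples:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj(v, s):
    # orthogonal projection of v onto the line spanned by a nonzero s
    c = dot(v, s) / dot(s, s)           # scalar coefficient
    return [c * si for si in s]

v = [3.0, 4.0]
s = [1.0, 0.0]                          # the x-axis direction
p = proj(v, s)                          # [3.0, 0.0]: the "shadow" of v on the line
r = [vi - pi for vi, pi in zip(v, p)]   # residual v - proj_s(v)

# The residual is perpendicular to the line, which is what makes
# this the *orthogonal* projection (and what note 35 uses to show
# it minimizes the error).
assert abs(dot(r, s)) < 1e-12
```

This is the "shadow" picture of notes 7 and 42: p is where the tip of v lands when dropped straight down onto the line.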

Sources

Metadata

Wikidata

  • ID : Q519967 (https://www.wikidata.org/wiki/Q519967)

Spacy pattern list

  • [{'LEMMA': 'projection'}]