임베딩,embedding

A word used in many different ways in ML/NLP, mathematics, typography(폰트,font), ....

ML/NLP:
{
MKL-reciprocally
임베딩,embedding
언임베딩,unembedding =언임베딩,unembedding =,unembedding 언임베딩 unembedding { WtEn:unembedding } //unembedding .... Ggl:unembedding Bing:unembedding
어텐션,attention
MLP,multilayer_perceptron or multi-layer_perceptron
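Below, a minimal numpy sketch (illustrative only; the names W_E, W_U, d_model are assumptions, not from this page) of how 임베딩,embedding and 언임베딩,unembedding relate in a toy language model: the embedding matrix maps token ids to vectors, and an unembedding matrix (here weight-tied to the embedding, one common choice) maps hidden vectors back to vocabulary logits.
{{{
# Toy sketch: embedding vs. unembedding (all names illustrative)
import numpy as np

vocab_size, d_model = 10, 4
rng = np.random.default_rng(0)

W_E = rng.normal(size=(vocab_size, d_model))  # embedding: token id -> vector
W_U = W_E.T                                   # unembedding; weight tying is an assumption here

token_id = 7
h = W_E[token_id]        # embed: a d_model-dimensional vector
logits = h @ W_U         # unembed: one score per vocabulary entry
print(logits.shape)      # (10,)
}}}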

word_embedding =,word_embedding . word_embedding
{
word embedding

rel
원핫,one-hot
단어,word
특징,feature?
Is this 전처리,preprocessing? Always? For a machine to understand 자연어,natural_language, it has to be embedded into a 벡터공간,vector_space (i.e. a 단어,word, a sentence, etc. has to be turned into a 벡터,vector, an element of the vector space)... chk (see the sketch below)
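A minimal sketch of the remark above, assuming a toy vocabulary (all names illustrative): an embedding lookup is exactly the product of a 원핫,one-hot row vector with an embedding matrix, which turns a discrete 단어,word into an element of a 벡터공간,vector_space.
{{{
# Toy sketch: one-hot x embedding matrix == row lookup (names illustrative)
import numpy as np

vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
d = 3
E = np.random.default_rng(1).normal(size=(len(vocab), d))  # embedding matrix

one_hot = np.zeros(len(vocab))
one_hot[vocab["queen"]] = 1.0

assert np.allclose(one_hot @ E, E[vocab["queen"]])  # same dense vector
print(E[vocab["queen"]])
}}}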

links en

Deep Learning, NLP, and Representations - colah's blog
https://colah.github.io/posts/2014-07-NLP-RNNs-Representations/
Explains it in connection with 심층학습,deep_learning

} // word embedding ... NN:"word embedding" Ggl:"word embedding" Bing:"word embedding"

} // embedding in ML, NLP




Math:
KmsE:embedding
https://en.wikipedia.org/wiki/Embedding
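One standard definition from topology (a sketch; it agrees with the dictionary gloss quoted near the end of this page): an embedding is an injective continuous map that is a homeomorphism onto its image.
{{{
% f : X -> Y is a (topological) embedding iff f is injective, continuous, and
f \colon X \xrightarrow{\;\cong\;} f(X) \subseteq Y
% i.e. f is a homeomorphism onto its image f(X), with the subspace topology on f(X).
}}}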
graph_embedding =,graph_embedding =,graph_embedding . graph_embedding
{
graph embedding
WtEn:graph_embedding x 2023-08-26
https://en.wikipedia.org/wiki/Graph_embedding - In topological graph theory, { https://en.wikipedia.org/wiki/Topological_graph_theory }
Up: 그래프,graph
} // "graph embedding" .... Ndict:graph embedding - 평면에의 매장 언급만 있음 (수백) / Ggl:graph embedding
order_embedding =,order_embedding =,order_embedding . order_embedding
{
order embedding
https://en.wikipedia.org/wiki/Order_embedding
WtEn:order_embedding x 2023-08-26
Ndict:order embedding x 2023-08-26
Up: 순서,order
}//order embedding .... Bing:"order embedding" Ggl:"order embedding"
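Per the linked Wikipedia article, an order embedding is a map between partially ordered sets that both preserves and reflects the order; as a LaTeX sketch:
{{{
f \colon (X, \le_X) \to (Y, \le_Y) \text{ is an order embedding} \iff
\forall a, b \in X :\; a \le_X b \Leftrightarrow f(a) \le_Y f(b)
% such an f is automatically injective, by antisymmetry of \le_X
}}}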
see also and merge: 수학,math#s-1




AKA imbedding


TMP WORKING 2023-08-27
KmsE:embedding - embedding is primarily '매장, 매입' / but 묻기, 매장, 매입, 끼워넣기, 몰입 - all of these appear.
KpsE:embedding - 끼워넣기
KcsE:embedding x
NdEn:embedding English-Korean dictionary: "어떤 위상(位相) 공간에서 다른 위상 공간으로의 동상 사상(同相寫像)" (a homeomorphic map from one topological space into another)
WtEn:embedding
Foldoc:embedding - good
WpSimple:embedding x
WpEn:embedding



Ndict:
Google:
Bing:
YouTube:
Srch: