Stochastic Process Note 6
Stochastic Process Note 5
Stochastic Process Note 4
Stochastic Process Note 1
This post begins my notes on learning stochastic processes.
A Brief to LangChain
LangChain
LangChain is a platform and framework for developers to build LLM-based applications. The illustration above shows what LangChain as a platform does in an LLM-based QA application, and its basic constitution.
The ellipse blocks in the illustration are functions provided in the LangChain package, and the square blocks are the types of information or generated results in the process. (Not all of the functions / modules are provided solely by LangChain; some come from open-source frameworks, ...
Building a Dormitory Web Server from Scratch
A few months ago, I ventured into hosting my personal blog on my first-ever cloud server. The affordability was certainly attractive, but the server's limited capabilities couldn't handle many tasks or support a development environment. However, with a PC and a spare laptop in my dormitory, both equipped with development environments, setting up a web server that could be accessed over the school's LAN seemed like a plausible solution. ...
Generative Models
Introduction to Generative Models
Definition
By definition, generative models focus on the image/pixels $X$ and tell the probability of $X$ existing/occurring in the dataset. The mathematical principle is that instead of training a model that gives a probability distribution over labels, it gives a probability distribution over images, namely $P_X(x)$.
This is different from a conditional generative model (below); here the model has learned the features of the images themselves, without conditi ...
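The contrast above can be written compactly. A discriminative classifier models a distribution over labels given an image, while a generative model assigns a probability to the image itself (symbols here follow the excerpt's notation):

```latex
% Discriminative: distribution over labels Y, conditioned on an image x
P(Y \mid X = x)

% Generative: distribution over images themselves
P_X(x), \qquad \int P_X(x)\,dx = 1
```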
Attention Learning
Attention Mechanism Learning Notes
Fundamentals of the Attention Mechanism
Vanilla Seq2seq RNNs
In the previous lecture, we talked about the seq2seq RNN model.
Specifically, the vanilla version transforms the output hidden state from the encoder into two parts ($s_0$ and $c$) and then sends them to the input side of the decoder.
Typically, the decoder uses $c$ as a context vector, viewing it as having stored all the information from the encoder. Thus $c$ is going to be one of the input ...
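The hand-off described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the post's actual model: the dimensions, weight matrices, and the toy decoder update are all assumptions.

```python
import numpy as np

hidden = 8  # encoder/decoder hidden size (assumption)
rng = np.random.default_rng(0)

# Final encoder hidden state h_T, after reading the whole input sequence.
h_T = rng.standard_normal(hidden)

# Transform h_T into the decoder's initial state s_0 and a context vector c.
W_s = rng.standard_normal((hidden, hidden))
W_c = rng.standard_normal((hidden, hidden))
s_0 = np.tanh(W_s @ h_T)
c = np.tanh(W_c @ h_T)  # fixed context: "stores" the encoder's information


def decoder_step(s_prev, y_prev, c, W):
    """One toy decoder step: previous state, previous output, and the
    SAME context vector c are concatenated and mixed through W."""
    x = np.concatenate([s_prev, y_prev, c])
    return np.tanh(W @ x)


W_dec = rng.standard_normal((hidden, 3 * hidden))
y_prev = np.zeros(hidden)  # start-token embedding (assumption)
s = s_0
for _ in range(3):  # unroll a few decoder steps
    s = decoder_step(s, y_prev, c, W_dec)
    y_prev = s  # toy stand-in for the step's output embedding
```

Note that the same fixed `c` is fed at every step; the attention mechanism the notes build toward replaces this single vector with a per-step weighted combination of encoder states.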