This image illustrates the Input Embedding phase of a Transformer model. The sentence “What is the capital of India?” is first split into tokens by a tokenizer. Each token is mapped to an integer ID and then looked up in a learned embedding layer, which converts it into a numerical vector that captures semantic meaning. This is a foundational step in natural language processing, allowing models to work with text in mathematical form.
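A minimal sketch of this two-step process is shown below, using PyTorch. Note the assumptions: a toy whitespace tokenizer stands in for the subword tokenizers (BPE, WordPiece) that real Transformers use, the embedding dimension of 8 is arbitrary, and the embedding weights are randomly initialized rather than trained.

```python
import torch
import torch.nn as nn

# Toy whitespace tokenizer and vocabulary (illustrative only;
# production models use subword tokenizers such as BPE or WordPiece)
sentence = "What is the capital of India?"
tokens = sentence.lower().replace("?", " ?").split()
vocab = {tok: idx for idx, tok in enumerate(sorted(set(tokens)))}

# Step 1: map each token string to an integer ID
token_ids = torch.tensor([vocab[tok] for tok in tokens])

# Step 2: look up each ID in a learned embedding matrix
# (randomly initialized here; trained jointly with the model in practice)
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)
vectors = embedding(token_ids)

print(tokens)         # ['what', 'is', 'the', 'capital', 'of', 'india', '?']
print(vectors.shape)  # torch.Size([7, 8]) -> one 8-dim vector per token
```

The key design point is that the embedding layer is just a lookup table: a matrix of shape (vocabulary size, embedding dimension) whose rows are adjusted during training so that tokens used in similar contexts end up with similar vectors.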