LCEL
In our last lesson we introduced Composition to create new Prompt Templates. We used Composition to switch our Prompt between humor and serious mode. But your Chain still only has one Prompt Template, and it can only be one or the other during a run. Sometimes it is useful to switch Prompts dynamically at runtime, based on a condition such as detected user intent. In this section we will go in depth on Routing. As it turns out, Routing applies to more than Prompt Templates; it is done the same way for every Langchain Component.
What is covered here is inseparable from the Langchain Expression Language (LCEL), which is a powerful syntax for creating Chains.
To understand LCEL let me provide a bit of historical context... If you didn't get the joke, it's alright, it was pretty lame.
Anyways, Langchain started off providing very powerful Chains that worked out of the box. These Chains still exist and you can find them in the docs. But they were often implemented differently per Chain, and it was difficult to know what was happening under the hood or to extend them. They were great for prototyping and for seeing what others had built, but there wasn't a standardized way to define Chains.
LCEL gives us a lower level of abstraction where we have granular control over how different parts of Chains interface. If you want to get a powerful Chain working as soon as possible you can check out the premade options. They are also good for inspiration.
But mastering LCEL will benefit you more in the long term; it is like a programming language for Prompt Engineering.
Components#
Langchain Components are a part of the LCEL abstraction. Let's take a look at the specific Components in LCEL and see how they interact.
Here is a table of Components with their input and output types. We will touch on some of these in later modules, so it might be useful to bookmark this table.
| Component | Input | Output |
| --- | --- | --- |
| Prompt | Object | PromptValue |
| Retriever | Single string | List of documents |
| LLM | Single string, list of chat messages, or PromptValue | String |
| ChatModel | Single string, list of chat messages, or PromptValue | ChatMessage |
| Tool | Single string or object, depending on the tool | Depends on the tool |
| OutputParser | The output of an LLM or ChatModel | Depends on the parser |
We have called them Chains but to be more technical they are "Sequences". A Sequence is a sequential ordering of Runnables. Like an array. It can have any number of Runnables and they will execute serially - one after the other. A Runnable is an instance of a Component. The ordering of Runnables is fluid and it's up to you to match up the inputs and outputs.
For reference our current Sequence looks like this.
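And here is a minimal sketch of that same Sequence in code. The import paths assume the @langchain/core style packages (they differ between Langchain.js versions), and the prompt, model, and output parser names are generic stand-ins for whatever you have defined in the course project:

```typescript
import { PromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { ChatOpenAI } from "@langchain/openai";

// Placeholder Runnables standing in for the ones defined in the course project.
const prompt = PromptTemplate.fromTemplate(
  `You are a helpful assistant. {currentMessage}`
);
const model = new ChatOpenAI();
const outputParser = new StringOutputParser();

// The pipe syntax chains the Runnables: Object -> PromptValue -> ChatMessage -> string
const chain = prompt.pipe(model).pipe(outputParser);
```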

In addition to Components, there are other classes you can include in a Sequence to Route the Runnables. Routing is useful because it introduces a way to programmatically affect the Sequence. If this is confusing, don't worry, we have plenty of examples below. You can follow along by going to api/lesson_02.03/index.ts in the course project and selecting the Routing option in the chat selector.
RunnableSequence#
Before we begin Routing, let's take a moment to introduce a second syntax for creating Chains - RunnableSequence. So far we have used the pipe syntax and it looks clean for smaller Chains, but it can quickly get confusing. The RunnableSequence syntax helps me visualize the sequential nature of Chains better and I personally prefer it for more complex Chains.
Here is what our Chain looks like with a RunnableSequence.
const chain = RunnableSequence.from([prompt, model, outputParser]);
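For completeness, here is a self-contained version with the import and an example invocation. This is a sketch assuming the @langchain/core import path and the placeholder Runnables from the earlier snippet:

```typescript
import { RunnableSequence } from "@langchain/core/runnables";

// The array order is the execution order: prompt -> model -> output parser.
const chain = RunnableSequence.from([prompt, model, outputParser]);

// Same behavior as prompt.pipe(model).pipe(outputParser)
const answer = await chain.invoke({ currentMessage: "Hello there!" });
```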
Branching#
The first Routing method to learn is Branching. A RunnableBranch dynamically selects from a list of Runnables based on cases.
Runnables used for Routing forward all of their inputs by default. So all options in the Branch receive the inputs of the Branch.
This is what a Branch would look like in our Sequence. At the Branch, the Sequence dynamically chooses a Runnable to take its place.

RunnableBranch#
Here is what a RunnableBranch looks like. The Branch takes in an array of pairs, where the first element of each pair is the case (a condition to check) and the second is the Runnable to use when that case matches. The final element is a default Runnable for when no case matches.
To demonstrate this we need some Runnables. We can use Prompt Templates.
const generalPrompt = PromptTemplate.fromTemplate(`You are a helpful assistant beep boop. {currentMessage}`)
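The Branch needs at least one more Runnable to choose between. The course code defines additional Prompt Templates; since their exact wording isn't shown here, the second prompt below is a hypothetical stand-in:

```typescript
// Hypothetical second prompt for the Branch to route to; the actual
// prompt in the course project may be worded differently.
const humorPrompt = PromptTemplate.fromTemplate(
  `You are a stand-up comedian. Answer with a joke. {currentMessage}`
);
```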
Next we can use a helper function to return a boolean. In this case the function returns true with a 30% chance. There is also a debug statement to log the outcome.
The logs will show up in your server-side console.
console.log(`${branchName ?? "default"}: ${result}`)
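Putting that together, here is a hedged sketch of what the helper and the Branch could look like. The helper name (randomChance), its signature, and the branchPrompt structure are assumptions based on the description above and on how the chain is assembled below; your course code may differ. The import path again assumes the @langchain/core packages:

```typescript
import { RunnableBranch } from "@langchain/core/runnables";

// Hypothetical helper: returns true roughly 30% of the time and logs
// which branch the check ran for, along with the outcome.
const randomChance = (branchName?: string): boolean => {
  const result = Math.random() < 0.3;
  console.log(`${branchName ?? "default"}: ${result}`);
  return result;
};

// Assumed shape of the Branch: [condition, Runnable] pairs first,
// then a default Runnable as the final element.
const branchPrompt = RunnableBranch.from([
  [() => randomChance("humor"), humorPrompt],
  generalPrompt,
]);
```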
You can add the Branch to the Sequence like this.
const chain = branchPrompt.pipe(model).pipe(outputParser);
Does it look familiar? We have already been piping prompt -> model -> output parser. So it shouldn't be much of a surprise.
RunnableBranch works for any Runnable Langchain Component and is very useful for dynamically routing Chains. You can even mix different Components in the same RunnableBranch if you wish. But be certain that the inputs and outputs are compatible with the rest of your Chain.
If we look closer at Branches, we'll find that they work like switch statements. The first case that evaluates to true returns the Runnable associated with that case; otherwise the default is returned. A RunnableBranch always returns a single Runnable.
Be aware that order matters. If you are picking by generating random numbers, this is actually not an even distribution: with the 30% check above, the first case fires about 30% of the time, but a second identical case only gets evaluated in the remaining 70% of runs (roughly a 21% chance overall), and the default takes whatever is left. We'll fix this later.

Mapping#
Another Routing method is "Mapping". A RunnableMap contains an object with specified keys, and each key holds a Runnable as its value. The inputs to the RunnableMap are given to each Runnable, and their results are returned in a response object under the original keys.
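To make the shape concrete, here is a minimal sketch, assuming the @langchain/core import path and reusing the placeholder Runnables from earlier; the actual lesson example may differ:

```typescript
import { RunnableMap } from "@langchain/core/runnables";

// Both Runnables receive the same input object; the result is an
// object with the same keys, holding each Runnable's output.
const map = RunnableMap.from({
  serious: generalPrompt.pipe(model).pipe(outputParser),
  funny: humorPrompt.pipe(model).pipe(outputParser),
});

// -> { serious: "...", funny: "..." }
const result = await map.invoke({ currentMessage: "Tell me about LCEL." });
```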
RunnableMap#