Building the useChat hook

Project Source Code

Get the project source code below, and follow along with the lesson material.

Download Project Source Code

To set up the project on your local machine, please follow the directions provided in the README.md file. If you run into any issues with running the project source code, then feel free to reach out to the author in the course's Discord channel.

  • [00:00 - 00:16] Welcome back. In this module we will build a chat, and in this lesson we will create a React useChat hook encapsulating the conversational use case. As you can see in the demo app, this use case lets the user interact with a model as in a natural conversation.

    [00:17 - 00:32] This is a very powerful user experience, as popularized by ChatGPT. We are not going to start from scratch: in previous modules we have already covered how to handle networking, how to handle streaming, and how to set up a backend.

    [00:33 - 00:52] As we have followed a modular approach, we will be able to reuse what we've done before, and now we simply need to create a new hook on the front end. Let's look at the hook, at its dependencies, and at its interface. As you can see, we are reusing our previous work.

    [00:53 - 01:07] First we need some logic to handle the network: that will be the chat query hook we have already built. We also need some logic to handle streaming: that will be done with the readTextStream function we built in the previous module.

    [01:08 - 01:20] And a few types, and so on. All this work was explained in the two previous modules and we are going to reuse it as is. On the backend, we will reuse exactly the same endpoint as in a previous module.

    [01:21 - 01:40] Now let's look at the interface. Our hook takes as input, via dependency injection, a hook handling the network. It also takes initial message values, which can be useful to set the starting state of your chat, for instance to add a system message.

    [01:41 - 01:53] Now let's look at the chat interface, which is returned by the useChat hook. We have an input; the input is simply text here. We have two booleans, a loading flag and an error flag.

    [01:54 - 02:03] We have a message list, which is the list of messages displayed in the UI. Then we have some actions.

    [02:04 - 02:15] Cancel to cancel the current stream, submit when the user wants to submit, reset to restart the conversation from scratch, and setInput to update the input when the user types.
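
The inputs and returned interface described above can be sketched as follows. This is a hedged sketch, not the course's actual code: names like `Message`, `UseChatResult`, and `initialMessages` are assumptions for illustration.

```typescript
// Hypothetical sketch of the useChat hook's public interface.
// The hook itself would receive the network hook via dependency
// injection and optional initial messages (e.g. a system prompt).

type Role = "system" | "user" | "assistant";

interface Message {
  role: Role;
  content: string;
}

// What the hook returns to the chat UI.
interface UseChatResult {
  input: string;                     // current text in the input field
  isLoading: boolean;                // true while a stream is being read
  error: boolean;                    // whether the last request failed
  messages: Message[];               // the list rendered in the UI
  setInput: (value: string) => void; // update input as the user types
  submit: () => void;                // send the current input to the model
  cancel: () => void;                // cancel the in-flight stream
  reset: () => void;                 // restart the conversation from scratch
}

// Helper to seed the starting state of the chat.
function initialMessages(init: Message[] = []): Message[] {
  return [...init];
}
```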

    [02:16 - 02:35] How are we going to implement this interface? Here are the states of our hook. We have some basic state: isLoading, a boolean to know whether we are reading a stream; input, a string; and an error state to keep track of errors during our chat.

    [02:36 - 02:41] And then two message states. Why do we have two message states?

    [02:42 - 02:50] Because one of them is displayed in the UI: it is updated any time something happens, like a user action or a model completion.

    [02:51 - 02:59] And the other one is what we send to the model; it is updated only on user input. So now that we've created our state, let's look at the network.
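
The split between the two message states can be sketched as a plain state shape. The names here (`displayedMessages`, `outboxMessages`, `initialChatState`) are assumptions, not the course's actual identifiers.

```typescript
// Hypothetical sketch of the hook's internal state.

interface Msg {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatState {
  isLoading: boolean; // whether we are currently reading a stream
  input: string;      // the text the user is typing
  error: boolean;     // whether something went wrong
  // Updated every time anything happens (user action or model
  // completion chunk); this is what the UI renders.
  displayedMessages: Msg[];
  // Updated only on user input; this is what we send to the model.
  outboxMessages: Msg[];
}

// Both lists start from the same seed (e.g. a system message).
function initialChatState(init: Msg[] = []): ChatState {
  return {
    isLoading: false,
    input: "",
    error: false,
    displayedMessages: [...init],
    outboxMessages: [...init],
  };
}
```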

    [03:00 - 03:06] Here we make a network call to the backend and we get a reader from it, similar to last time.

    [03:07 - 03:11] So now we have a reader from the network. We need to parse it.

    [03:12 - 03:19] That will be done with a useEffect. This useEffect depends on the reader, and in it we call the readTextStream method.

    [03:20 - 03:29] We pass a handler to the onEvent property, and any time an event is parsed, if it's a data event carrying a message, we update the messages.

    [03:30 - 03:38] What do we do? We find the last message, an assistant message, and we append the chunk at the end of its content.

    [03:39 - 03:45] And we simply need to make sure to actually change the reference, so that React notices the update and re-renders. Then there is still a little bit of work to be done.

    [03:46 - 03:56] We need to update the loading and error states as needed. And lastly, the cleanup, should the component be destroyed mid-stream, will cancel the reader.
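
The chunk-handling step inside the effect can be sketched as a pure function. This is a sketch with an assumed name (`appendChunk`); the key point from the lesson is that we return fresh array and object references rather than mutating in place, so React re-renders.

```typescript
// Hypothetical sketch of appending a streamed chunk to the last
// (assistant) message without mutating the previous state.

interface Msg {
  role: "system" | "user" | "assistant";
  content: string;
}

function appendChunk(messages: Msg[], chunk: string): Msg[] {
  if (messages.length === 0) return messages;
  const last = messages[messages.length - 1];
  // Only stream into an assistant message.
  if (last.role !== "assistant") return messages;
  // Fresh array and fresh last-message object: mutating in place
  // would keep the same reference and React would not re-render.
  return [
    ...messages.slice(0, -1),
    { ...last, content: last.content + chunk },
  ];
}
```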

    [03:57 - 04:00] Now, let's take a look at the actions. Here we have a simple reset.

    [04:01 - 04:08] We reset every state to its default and we cancel the current stream. Cancel is a cancel of the reader.

    [04:09 - 04:18] And submit is where we create our new messages: a user message and an assistant message, and we update the message states.
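
The reset and submit actions can be sketched as pure state transitions over the two message lists. Names (`reset`, `submit`, `displayed`, `outbox`) are assumptions for illustration; in the hook, reset would also call `reader.cancel()` on the in-flight stream.

```typescript
// Hypothetical sketch of the reset and submit actions.

interface Msg {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatState {
  isLoading: boolean;
  input: string;
  error: boolean;
  displayed: Msg[]; // rendered in the UI
  outbox: Msg[];    // sent to the model
}

// reset: every piece of state back to its default (optionally
// re-seeded with starting messages such as a system prompt).
function reset(init: Msg[] = []): ChatState {
  return {
    isLoading: false,
    input: "",
    error: false,
    displayed: [...init],
    outbox: [...init],
  };
}

// submit: append the user's message to both lists, plus an empty
// assistant message on the displayed side for the stream to fill in,
// then clear the input and mark the chat as loading.
function submit(state: ChatState): ChatState {
  const userMsg: Msg = { role: "user", content: state.input };
  return {
    ...state,
    isLoading: true,
    input: "",
    outbox: [...state.outbox, userMsg],
    displayed: [...state.displayed, userMsg, { role: "assistant", content: "" }],
  };
}
```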

    [04:19 - 04:21] And we are done. We have created a useChat hook.

    [04:22 - 04:27] We have encapsulated all the stateful logic of our use case. Let's take a brief look at the tests.

    [04:28 - 04:33] We already explained how to test such a hook in a previous module, and it's exactly the same.

    [04:34 - 04:41] We are going to reuse the same logic. For instance, we create a mock stream with the previously built mock-stream function.

    [04:42 - 04:49] We set up a test component to run the hook. And then, exactly as last time, we create tests and run them.

    [04:50 - 04:58] For instance, we want to check that the loading state is correct. So we set up the stream, the chat, the component.

    [04:59 - 05:03] We submit, we await some state updates, and we check that indeed the chat is loading.

    [05:04 - 05:10] So we built our useChat hook. In the next lesson, we will see how to build the UI.