
In Chainlit version 2.0.1, why is the Langchain CoT (Chain of Thought) process not displayed? #1683

Open
HwJhx opened this issue Jan 13, 2025 · 2 comments
Labels: bug (Something isn't working), needs-triage

Comments


HwJhx commented Jan 13, 2025

Describe the bug
In Chainlit version 2.0.1, why is the Langchain CoT (Chain of Thought) process not displayed?

To Reproduce
Steps to reproduce the behavior:

  1. OS: Windows 10 x64
  2. Install Chainlit v2.0.1:
Name: chainlit
Version: 2.0.1
Summary: Build Conversational AI.
  3. My config.toml:
```toml
[project]
# Whether to enable telemetry (default: true). No personal data is collected.
enable_telemetry = true

# List of environment variables to be provided by each user to use the app.
user_env = []

# Duration (in seconds) during which the session is saved when the connection is lost
session_timeout = 3600

# Duration (in seconds) of the user session expiry
user_session_timeout = 1296000  # 15 days

# Enable third parties caching (e.g. LangChain cache)
cache = false

# Authorized origins
allow_origins = ["*"]

[features]
# Process and display HTML in messages. This can be a security risk (see https://stackoverflow.com/questions/19603097/why-is-it-dangerous-to-render-user-generated-html-or-javascript)
unsafe_allow_html = false

# Process and display mathematical expressions. This can clash with "$" characters in messages.
latex = false

# Automatically tag threads with the current chat profile (if a chat profile is used)
auto_tag_thread = true

# Allow users to edit their own messages
edit_message = true

# Authorize users to spontaneously upload files with messages
[features.spontaneous_file_upload]
    enabled = true
    accept = ["*/*"]
    max_files = 20
    max_size_mb = 500

[features.audio]
    # Sample rate of the audio
    sample_rate = 24000

[UI]
# Name of the assistant.
name = "VHAL_Assistant"

# default_theme = "dark"

# layout = "wide"

# Description of the assistant. This is used for HTML tags.
# description = ""

# Chain of Thought (CoT) display mode. Can be "hidden", "tool_call" or "full".
cot = "full"

# Link to your github repo. This will add a github button in the UI's header.
# github = ""

# Specify a CSS file that can be used to customize the user interface.
# The CSS file can be served from the public directory or via an external link.
# custom_css = "/public/test.css"

# Specify a Javascript file that can be used to customize the user interface.
# The Javascript file can be served from the public directory.
# custom_js = "/public/test.js"

# Specify a custom meta image url.
# custom_meta_image_url = "https://chainlit-cloud.s3.eu-west-3.amazonaws.com/logo/chainlit_banner.png"

# Specify a custom build directory for the frontend.
# This can be used to customize the frontend code.
# Be careful: If this is a relative path, it should not start with a slash.
# custom_build = "./public/build"

[meta]
generated_by = "2.0.1"
```
  4. Copy and run the official example code as follows:
```python
from langchain_openai import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.schema import StrOutputParser
from langchain.schema.runnable import Runnable
from langchain.schema.runnable.config import RunnableConfig
from typing import cast

import chainlit as cl


@cl.on_chat_start
async def on_chat_start():
    model = ChatOpenAI(streaming=True)
    prompt = ChatPromptTemplate.from_messages(
        [
            (
                "system",
                "You're a very knowledgeable historian who provides accurate and eloquent answers to historical questions.",
            ),
            ("human", "{question}"),
        ]
    )
    runnable = prompt | model | StrOutputParser()
    cl.user_session.set("runnable", runnable)


@cl.on_message
async def on_message(message: cl.Message):
    runnable = cast(Runnable, cl.user_session.get("runnable"))

    msg = cl.Message(content="")

    async for chunk in runnable.astream(
        {"question": message.content},
        config=RunnableConfig(callbacks=[cl.LangchainCallbackHandler()]),
    ):
        await msg.stream_token(chunk)

    await msg.send()
```
  5. However, the Chainlit conversation interface does not display LangChain's CoT steps as shown in the official example. The following screenshot shows the state of my execution:

[screenshot]
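For reference, the streaming handler in the example above follows a plain async-iteration pattern that can be exercised without Chainlit or LangChain installed. In this sketch the names `fake_stream` and `handle_message` are hypothetical: `fake_stream` stands in for `runnable.astream()`, and accumulating into a list stands in for `msg.stream_token()` / `msg.send()`:

```python
import asyncio


async def fake_stream(question: str):
    # Stand-in for runnable.astream(): yields the answer token by token.
    for token in ["The", " Battle", " of", " Hastings", " was", " fought", " in", " 1066."]:
        yield token


async def handle_message(question: str) -> str:
    # Mirrors the @cl.on_message handler: accumulate streamed chunks,
    # then "send" the completed message as one string.
    parts = []
    async for chunk in fake_stream(question):
        parts.append(chunk)  # analogous to: await msg.stream_token(chunk)
    return "".join(parts)    # analogous to: await msg.send()


if __name__ == "__main__":
    print(asyncio.run(handle_message("When was the Battle of Hastings?")))
```

Note that in the real app the per-step (CoT) rendering does not come from this loop at all; it comes from the `cl.LangchainCallbackHandler()` passed in the `RunnableConfig`, which is why the bug report focuses on the callback/UI side rather than the streaming code.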

Expected behavior
The old version of Chainlit could display the LangChain CoT information, as shown below:

[screenshot]


dosubot bot commented Jan 13, 2025

I found 1 similar issue that might be helpful: Chainlit CoT not displayed in version 2.0.1. The suggested solution is that `hide_cot` has been replaced with `cot`, which relates to your issue of the LangChain CoT process not being displayed in the current version.
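For anyone arriving here after an upgrade: a `config.toml` carried over from a pre-2.0 install may still contain the removed key, which no longer has any effect. The change looks roughly like this (the pre-2.0 `hide_cot` syntax is recalled from older docs, so double-check it against your old config):

```toml
[UI]
# Chainlit < 2.0: boolean flag
# hide_cot = false

# Chainlit >= 2.0: replaced by a display mode,
# one of "hidden", "tool_call" or "full"
cot = "full"
```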


@dosubot dosubot bot added the bug Something isn't working label Jan 13, 2025
@HwJhx HwJhx closed this as completed Jan 13, 2025
@HwJhx HwJhx reopened this Jan 13, 2025
HwJhx (Author) commented Jan 13, 2025

> I found 1 similar issue that might be helpful: Chainlit CoT not displayed in version 2.0.1. The suggested solution is that `hide_cot` has been replaced with `cot`, which relates to your issue of the Langchain CoT process not being displayed in the current version.

Thank you for pointing that out. I had already noticed this change; the bug I'm reporting occurs specifically when the `cot` setting is `"full"`. The LangChain CoT process still does not display as expected in that case.
