LLM Chat

AI Assistant

Hello! I'm your AI assistant. How can I help you today? You can ask me anything from coding help to creative writing.

You

Can you explain how to create a Streamlit app for LLMs?

AI Assistant

Certainly! Here's a basic structure for a Streamlit app that interacts with an LLM:

import streamlit as st
from openai import OpenAI

# Set up the page
st.set_page_config(page_title="LLM Chat App", page_icon="🤖")

# Initialize the OpenAI client
client = OpenAI(api_key=st.secrets["OPENAI_API_KEY"])

# Sidebar for settings
with st.sidebar:
    st.title("LLM Settings")
    model = st.selectbox("Choose a model", ["gpt-4", "gpt-3.5-turbo"])
    temperature = st.slider("Temperature", 0.0, 1.0, 0.7)

# Main chat interface
st.title("💬 LLM Chat")
if "messages" not in st.session_state:
    st.session_state.messages = []

# Display chat messages
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

# Chat input
if prompt := st.chat_input("What would you like to ask?"):
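    # Store the user's message in the history and show it on screen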
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)
    
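    # Send the full conversation to the model and display its reply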
    with st.chat_message("assistant"):
        response = client.chat.completions.create(
            model=model,
            messages=st.session_state.messages,
            temperature=temperature
        )
        reply = response.choices[0].message.content
        st.markdown(reply)
    
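    # Save the assistant's reply so it persists across reruns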
    st.session_state.messages.append({"role": "assistant", "content": reply})

To run this, you'll need to install Streamlit (pip install streamlit) and the OpenAI package (pip install openai). Save the code as app.py, and since it reads the API key from st.secrets, add your OpenAI API key to a .streamlit/secrets.toml file in the same project folder. Then start the app with streamlit run app.py.
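A minimal secrets.toml could look like this (the key name just needs to match what the code looks up; replace the value with your own key):

# .streamlit/secrets.toml
OPENAI_API_KEY = "sk-your-key-here"

If you deploy to Streamlit Community Cloud, you can paste the same line into the app's Secrets settings instead of committing this file.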

