AI创想

Title: LangChain Cookbook Part 1

Author: tianjinji    Time: yesterday 14:11
Source: CSDN blog
Adapted from https://github.com/gkamradt/langchain-tutorials/blob/main/LangChain%20Cookbook%20Part%201%20-%20Fundamentals.ipynb
LangChain Cookbook, Part 1

This document is based on the LangChain Conceptual Documentation.
The goal is to introduce LangChain's components and use cases.
What is LangChain?

LangChain is a framework for developing applications powered by language models.
LangChain makes the complicated parts of working and building with AI models easier. It helps do this in two ways: integration (bringing external data, such as your files, other applications, and API data, to your LLMs) and agency (allowing your LLMs to interact with their environment via decision making).
Why LangChain?

import os
from dotenv import load_dotenv, find_dotenv

# Load your OPENAI_API_KEY (and any other secrets) from a local .env file
_ = load_dotenv(find_dotenv())
LangChain Components

Schema - Nuts and Bolts of working with Large Language Models (LLMs)

Text

The natural language way to interact with LLMs.

# You'll be working with simple strings (that'll soon grow in complexity!)
my_text = "What day comes after Friday?"
my_text

'What day comes after Friday?'
Chat Messages

Like text, but tagged with a message type (System, Human, AI).

from langchain_openai import ChatOpenAI
from langchain.schema import HumanMessage, SystemMessage, AIMessage

# This is the language model we'll use. We'll talk about what we're doing below in the next section
chat = ChatOpenAI(temperature=.7)
Now let's create a few messages that simulate a chat experience with a bot.

chat.invoke([
    SystemMessage(content="You are a nice AI bot that helps a user figure out what to eat in one short sentence"),
    HumanMessage(content="I like tomatoes, what should I eat?")
]).content

'You might enjoy a caprese salad with fresh tomatoes, mozzarella, basil, and balsamic glaze.'
You can also pass in more history of the chat with the AI.

chat.invoke([
    SystemMessage(content="You are a nice AI bot that helps a user figure out where to travel in one short sentence"),
    HumanMessage(content="I like the beaches where should I go?"),
    AIMessage(content="You should go to Nice, France"),
    HumanMessage(content="What else should I do when I'm there?")
]).content

'Explore the charming Old Town and enjoy the vibrant local markets in Nice, France.'
You can also exclude the system message if you want.

chat.invoke([
    HumanMessage(content="What day comes after Thursday?")
]).content

'Friday.'
Documents

An object that holds a piece of text plus metadata (more information about that text).

from langchain.schema import Document

Document(page_content="This is my document. It is full of text that I've gathered from other places",
         metadata={
             'my_document_id': 234234,
             'my_document_source': "The LangChain Papers",
             'my_document_create_time': 1680013019
         })

Document(metadata={'my_document_id': 234234, 'my_document_source': 'The LangChain Papers', 'my_document_create_time': 1680013019}, page_content="This is my document. It is full of text that I've gathered from other places")

But you don't have to include metadata if you don't want to.

Document(page_content="This is my document. It is full of text that I've gathered from other places")

Document(page_content="This is my document. It is full of text that I've gathered from other places")
Models - The interface to the AI brains

Language Model

A model that does text in ➡️ text out!
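As a quick illustration of the text-in, text-out interface, here is a minimal sketch, assuming `langchain_openai` is installed; the environment-variable guard keeps it a no-op when no OPENAI_API_KEY is configured:

```python
import os

prompt = "What day comes after Friday?"

# Only call the API when credentials are available; with a key set,
# the completion-style model returns plain text (e.g. 'Saturday')
if os.getenv("OPENAI_API_KEY"):
    from langchain_openai import OpenAI  # completion-style counterpart to ChatOpenAI
    llm = OpenAI(temperature=0)          # temperature=0 for a more deterministic answer
    print(llm.invoke(prompt))
```

Note the contrast with the chat model below: here a plain string goes in and a plain string comes out, with no message types involved.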
Chat Model

A model that takes a series of messages and returns a message output.

from langchain_openai import ChatOpenAI
from langchain.schema import HumanMessage, SystemMessage, AIMessage

chat = ChatOpenAI(temperature=1)
chat.invoke([
    SystemMessage(content="You are an unhelpful AI bot that makes a joke at whatever the user says"),
    HumanMessage(content="I would like to go to New York, how should I do this?")
])

AIMessage(content='Why did the scarecrow win an award? Because he was outstanding in his field!', response_metadata={'token_usage': {'completion_tokens': 17, 'prompt_tokens': 43, 'total_tokens': 60}, 'model_name': 'gpt-3.5-turbo', 'system_fingerprint': None, 'finish_reason': 'stop', 'logprobs': None}, id='run-eaef1b5e-de25-4df3-ab6a-a52fe3bca608-0')
Function Calling Model

Function calling models are similar to chat models, but a little different: they are fine-tuned to give structured data outputs.
This comes in handy when you're making an API call to an external service or doing extraction.
chat = ChatOpenAI(model='gpt-3.5-turbo-0613', temperature=1)

output = chat.invoke(
    [
        SystemMessage(content="You are a helpful AI bot"),
        HumanMessage(content="What's the weather like in Boston right now?")
    ],
    functions=[{
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA"
                },
                "unit": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"]
                }
            },
            "required": ["location"]
        }
    }]
)
output

AIMessage(content='', additional_kwargs={'function_call': {'arguments': '{\n  "location": "Boston"\n}', 'name': 'get_current_weather'}}, response_metadata={'token_usage': {'completion_tokens': 16, 'prompt_tokens': 91, 'total_tokens': 107}, 'model_name': 'gpt-3.5-turbo-0613', 'system_fingerprint': None, 'finish_reason': 'function_call', 'logprobs': None}, id='run-4ff96b3c-ec67-4d49-86f1-512046eaf4f4-0')
See the extra additional_kwargs that got passed back to us? We can take that and pass it to an external API to get the data. It saves the hassle of doing output parsing.
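The function_call arguments come back as a JSON string, so one json.loads call turns them into usable Python values before you forward them to your weather API. A minimal sketch, using a literal stand-in for the additional_kwargs shown above:

```python
import json

# Literal stand-in for the additional_kwargs returned by the model above
additional_kwargs = {
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{\n  "location": "Boston"\n}',
    }
}

call = additional_kwargs["function_call"]
args = json.loads(call["arguments"])   # JSON string -> dict
print(call["name"], args["location"])  # get_current_weather Boston
```

In a real application you would dispatch on call["name"] and pass args straight to the matching function.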
Text Embedding Model

Change your text into a vector (a series of numbers that hold the semantic "meaning" of your text). Mainly used when comparing two pieces of text.
BTW: semantic means "relating to meaning in language or logic."

from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings()
text = "Hi! It's time for the beach"

text_embedding = embeddings.embed_query(text)
print(f"Here's a sample: {text_embedding[:5]}...")
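Since embeddings are mainly used for comparing two pieces of text, the standard comparison is cosine similarity. A minimal pure-Python sketch with made-up 3-dimensional vectors (real OpenAI embeddings have ~1536 dimensions and would come from embeddings.embed_query):

```python
import math

def cosine_similarity(a, b):
    # dot(a, b) / (|a| * |b|): close to 1.0 means similar meaning
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Made-up toy vectors for illustration only
beach = [0.9, 0.1, 0.2]
ocean = [0.8, 0.2, 0.3]
tax_forms = [0.1, 0.9, 0.7]

print(cosine_similarity(beach, ocean))      # ~0.98, semantically close
print(cosine_similarity(beach, tax_forms))  # ~0.30, far apart
```

This is the operation vector stores perform for you at scale when you do similarity search over embedded documents.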
Original article: https://blog.csdn.net/a_blade_of_grass/article/details/140371277



