Quick Start Guide: How to Quickly Build an End-to-End Language Model Application with LangChain on Huawei Cloud ModelArts
[Abstract] Preliminary setup: log in to Huawei Cloud, open ModelArts from the console, create a GPU notebook named "notebook-langchain" (GPU: 1*T4(16GB) | CPU: 8 vCPUs 32GB, 50 GB disk), and open it from the Task Center. Then install LangChain, set the OpenAI API key, and build a simple LLM and chat-model application.
1. Log in to your Huawei Cloud account.
2. Click "Console" in the upper-right corner and type "ModelArts" in the search bar.
3. Go to "DevEnviron" > "Notebook" and click "Create".
4. On the creation page, name the notebook "notebook-langchain", select the GPU flavor "GPU: 1*T4(16GB) | CPU: 8 vCPUs 32GB", set the disk size to 50 GB, click "Create Now", and confirm with "Create".
5. Return to the "Task Center" and click the notebook to open it once it is running.
First, install LangChain by running the following command in a notebook cell:
!pip install langchain
Looking in indexes: http://repo.myhuaweicloud.com/repository/pypi/simple
Collecting langchain
  Downloading http://repo.myhuaweicloud.com/repository/pypi/packages/1f/fd/f2aa39f8e63a6fbacf2e7be820b846c27b1e5830af9c2e2e208801b6c07f/langchain-0.0.27-py3-none-any.whl (124 kB)
     |████████████████████████████████| 124 kB 49.0 MB/s eta 0:00:01
Collecting SQLAlchemy
  Downloading http://repo.myhuaweicloud.com/repository/pypi/packages/ac/d8/51e617a1eb143a48ab2dceb194afe40b3c42b785723a031cc29a8c04103d/sqlalchemy-2.0.20-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.9 MB)
     |████████████████████████████████| 2.9 MB 30.8 MB/s eta 0:00:01
Requirement already satisfied: numpy in /home/ma-user/anaconda3/envs/pytorch-1.8/lib/python3.7/site-packages (from langchain) (1.19.5)
Requirement already satisfied: PyYAML in /home/ma-user/anaconda3/envs/pytorch-1.8/lib/python3.7/site-packages (from langchain) (5.1)
Requirement already satisfied: requests in /home/ma-user/anaconda3/envs/pytorch-1.8/lib/python3.7/site-packages (from langchain) (2.27.1)
Collecting pydantic
  Downloading http://repo.myhuaweicloud.com/repository/pypi/packages/fd/35/86b1e7571e695587df0ddf2937100436dce0caa277d2f016d4e4f7d3791a/pydantic-2.2.1-py3-none-any.whl (373 kB)
     |████████████████████████████████| 373 kB 55.0 MB/s eta 0:00:01
Collecting typing-extensions>=4.6.1
  Downloading http://repo.myhuaweicloud.com/repository/pypi/packages/38/60/300ad6f93adca578bf05d5f6cd1d854b7d140bebe2f9829561aa9977d9f3/typing_extensions-4.6.2-py3-none-any.whl (31 kB)
Collecting pydantic-core==2.6.1
  Downloading http://repo.myhuaweicloud.com/repository/pypi/packages/c0/ca/4cf24afe80f5839a5cad5e35e2a0a11fe41b0f4f6a544109f73337567579/pydantic_core-2.6.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.9 MB)
     |████████████████████████████████| 1.9 MB 38.3 MB/s eta 0:00:01
Collecting annotated-types>=0.4.0
  Downloading http://repo.myhuaweicloud.com/repository/pypi/packages/d8/f0/a2ee543a96cc624c35a9086f39b1ed2aa403c6d355dfe47a11ee5c64a164/annotated_types-0.5.0-py3-none-any.whl (11 kB)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /home/ma-user/anaconda3/envs/pytorch-1.8/lib/python3.7/site-packages (from requests->langchain) (1.26.12)
Requirement already satisfied: certifi>=2017.4.17 in /home/ma-user/anaconda3/envs/pytorch-1.8/lib/python3.7/site-packages (from requests->langchain) (2022.9.24)
Requirement already satisfied: charset-normalizer~=2.0.0 in /home/ma-user/anaconda3/envs/pytorch-1.8/lib/python3.7/site-packages (from requests->langchain) (2.0.12)
Requirement already satisfied: idna<4,>=2.5 in /home/ma-user/anaconda3/envs/pytorch-1.8/lib/python3.7/site-packages (from requests->langchain) (3.4)
Collecting greenlet!=0.4.17
  Downloading http://repo.myhuaweicloud.com/repository/pypi/packages/1a/ed/72998fb3609f6c4b0817df32e2b98a88bb8510613d12d495bbab8534ebd0/greenlet-2.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (514 kB)
     |████████████████████████████████| 514 kB 12.0 MB/s eta 0:00:01
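To double-check which version the mirror resolved (0.0.27 in the log above), you can query the installed distribution. This is just a sketch; it uses pkg_resources, which ships with setuptools and works on the notebook's Python 3.7 environment:
import pkg_resources

# Print the version of langchain that pip just installed
print(pkg_resources.get_distribution("langchain").version)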
After the installation completes, set your OpenAI API key as an environment variable so that LangChain can call the OpenAI API:
import os
os.environ["OPENAI_API_KEY"] = "..."
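If you prefer not to hard-code the key in the notebook file, a minimal standard-library alternative is to prompt for it at runtime (a sketch; getpass only hides the typed input, it does not encrypt anything):
import getpass
import os

# Read the key interactively so it is not saved in the notebook
os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")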
Building a Language Model Application: LLM
With LangChain installed and the environment configured, we can start building our language model application. LangChain provides a collection of modules that you can use to create language model applications. You can combine these modules for more complex applications, or use them on their own for simpler ones. The most basic use is calling an LLM directly on some text, as sketched below.
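A rough sketch of that basic call (assuming the openai Python package is installed alongside LangChain and the key set above is valid; the prompt text is only an illustration):
from langchain.llms import OpenAI

# A higher temperature makes the generated text more varied
llm = OpenAI(temperature=0.9)

text = "What would be a good company name for a company that makes colorful socks?"
print(llm(text))
The call returns the raw completion text, which is the main difference from the chat models described in the next section.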
Building a Language Model Application: Chat Model
Besides LLMs, you can also use chat models. These are variants of language models: they use a language model under the hood, but expose a different interface. Instead of a "text in, text out" API, they take chat messages as inputs and outputs. Chat model APIs are still fairly new, so the community is still working out the best abstractions for them.
To get a chat completion, you pass one or more messages to the chat model. LangChain currently supports the AIMessage, HumanMessage, SystemMessage, and ChatMessage types. You will mostly be working with HumanMessage, AIMessage, and SystemMessage. Here is an example of using a chat model:
from langchain.chat_models import ChatOpenAI
from langchain.schema import (
    AIMessage,
    HumanMessage,
    SystemMessage
)
chat = ChatOpenAI(temperature=0)
You can get a completion by passing in a single message:
chat([HumanMessage(content="Translate this sentence from English to French. I love programming.")])
# -> AIMessage(content="J'aime programmer.", additional_kwargs={})
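You can also pass several messages at once, for example a SystemMessage that sets the assistant's behavior followed by the user's request. This sketch reuses the classes imported above; the exact reply will vary between runs:
messages = [
    SystemMessage(content="You are a helpful assistant that translates English to French."),
    HumanMessage(content="I love programming.")
]
# Returns an AIMessage, e.g. AIMessage(content="J'aime programmer.", additional_kwargs={})
chat(messages)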