1. What is this?

This is a Java implementation of LangChain. Large language models (LLMs) are emerging as a transformative technology that lets developers build applications they could not build before. Using an LLM in isolation, however, is often not enough for a truly powerful application; the real power appears when you combine LLMs with other sources of computation or knowledge. This library is designed to assist in developing such applications.

2. Quick Start Guide

2.1 Maven Repository

Prerequisites:

1. Java 17 or later

2. A Unix-like environment (we use Linux and Mac OS X)

3. Maven (we recommend 3.8.6; at least 3.5.4 is required)

```xml
<dependency>
    <groupId>io.github.hamawhitegg</groupId>
    <artifactId>langchain-core</artifactId>
    <version>0.1.12</version>
</dependency>
```

2.2 Extending the LangChain framework

Using LangChain usually requires integrating with one or more model providers, data stores, APIs, and so on. In this example, we will extend the LangChain framework to support Baidu's ERNIE Bot (文心一言) API.

2.2.1 Extend BaseLLM
To add a new model provider, extend BaseLLM and override llmType() and innerGenerate(). The implementation below posts the prompt to the ERNIE Bot endpoint and merges the streamed response chunks into a single generation:

```java
// The import paths are assumptions: framework classes are taken from
// langchain-java's com.hw.langchain packages, TypeReference from Jackson,
// and JSONObject from fastjson. Adjust them to match your project.
import com.alibaba.fastjson.JSONObject;
import com.fasterxml.jackson.core.type.TypeReference;
import com.hw.langchain.llms.base.BaseLLM;
import com.hw.langchain.requests.TextRequestsWrapper;
import com.hw.langchain.schema.Generation;
import com.hw.langchain.schema.GenerationChunk;
import com.hw.langchain.schema.LLMResult;
import com.hw.langchain.utils.JsonUtils;
import lombok.Builder;
import lombok.experimental.SuperBuilder;
import org.apache.commons.lang3.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import static java.util.Objects.requireNonNull;

@SuperBuilder
public class WenXinQianFan extends BaseLLM {

    private static final Logger LOG = LoggerFactory.getLogger(WenXinQianFan.class);

    /**
     * Endpoint URL to use.
     */
    @Builder.Default
    private String endpointUrl =
            "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions?access_token=";

    /**
     * Max tokens allowed to pass to the model.
     */
    @Builder.Default
    private int maxToken = 4096;

    /**
     * LLM model temperature from 0 to 10.
     */
    @Builder.Default
    private float temperature = 0.95f;

    /**
     * History of the conversation.
     */
    @Builder.Default
    private List<JSONObject> messages = new ArrayList<>();

    /**
     * Top P for nucleus sampling from 0 to 1.
     */
    @Builder.Default
    private float topP = 1.0f;

    /**
     * Penalty score (declared here but not yet included in the request payload).
     */
    @Builder.Default
    private float penaltyScore = 1.5f;

    @Builder.Default
    private boolean stream = false;

    @Builder.Default
    private String accessToken = "";

    /**
     * Whether to use history or not.
     */
    private boolean withHistory;

    private TextRequestsWrapper requestsWrapper;

    public WenXinQianFan init() {
        Map<String, String> headers = Map.of("Content-Type", "application/json");
        this.requestsWrapper = new TextRequestsWrapper(headers);
        return this;
    }

    @Override
    public String llmType() {
        return "wenxin";
    }

    public List<String> createStream(String prompt, List<String> stop) {
        // Append the user prompt to the conversation history.
        JSONObject message = new JSONObject();
        message.put("content", prompt);
        message.put("role", "user");
        messages.add(message);

        Map<String, Object> payload = Map.of(
                "temperature", temperature,
                "messages", messages,
                "max_length", maxToken,
                "stream", stream,
                "top_p", topP);
        LOG.debug("WenXin payload: {}", payload);

        String response = requestsWrapper.post(endpointUrl + this.accessToken, payload);
        LOG.debug("WenXin response: {}", response);
        // Each line of the response body is one streamed event.
        return response.lines().toList();
    }

    @Override
    protected LLMResult innerGenerate(List<String> prompts, List<String> stop) {
        List<List<Generation>> generations = new ArrayList<>();
        for (String prompt : prompts) {
            // Merge all streamed chunks into one final generation.
            GenerationChunk finalChunk = null;
            for (String streamResp : createStream(prompt, stop)) {
                if (StringUtils.isNotEmpty(streamResp)) {
                    GenerationChunk chunk = streamResponseToGenerationChunk(streamResp);
                    finalChunk = (finalChunk == null) ? chunk : finalChunk.add(chunk);
                }
            }
            generations.add(List.of(requireNonNull(finalChunk)));
        }
        return new LLMResult(generations);
    }

    public static GenerationChunk streamResponseToGenerationChunk(String streamResponse) {
        // Strip the SSE "data:" prefix, then read the generated text from the "result" field.
        String json = streamResponse.replace("data:", "");
        Map<String, Object> parsedResponse =
                JsonUtils.convertFromJsonStr(json, new TypeReference<Map<String, Object>>() {});
        String text = (String) parsedResponse.getOrDefault("result", "");
        return new GenerationChunk(text, null);
    }
}
```
2.2.2 Testing
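
A minimal usage sketch for the class above, not an official test from the project: it assumes a valid Baidu access token (the WENXIN_ACCESS_TOKEN environment variable name is hypothetical) and that the predict method is inherited from langchain-java's base classes.

```java
public class WenXinQianFanTest {

    public static void main(String[] args) {
        // WENXIN_ACCESS_TOKEN is a hypothetical variable name; supply the
        // access token issued for your Baidu Qianfan application.
        WenXinQianFan llm = WenXinQianFan.builder()
                .accessToken(System.getenv("WENXIN_ACCESS_TOKEN"))
                .temperature(0.7f)
                .build()
                .init();

        // predict(...) is assumed to route through the generate pipeline,
        // which eventually invokes innerGenerate(...) defined above.
        String answer = llm.predict("用一句话介绍一下文心一言。");
        System.out.println(answer);
    }
}
```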

2.2.3 Request results
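
Based on the parsing code in section 2.2.1, each non-empty line of a streaming reply is an SSE-style `data:` event whose JSON body carries the generated text in a `result` field, in the shape of (placeholder value, not a real response):

```
data: {"result": "<streamed text chunk>"}
```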