
Commit 53f1cf9

beklapko and lnhsingh authored

Brody/doc 60 port integrations docs (#153)

Current preview: https://langchain-5e9cc07a-preview-brodyd-1755879478-9f01bc4.mintlify.app/oss/python/integrations/providers/anthropic
Related task: https://linear.app/langchain/issue/DOC-60/port-integrations-docs

## What this PR does

- Using our docs CLI, migrates everything in this folder https://github.com/langchain-ai/langchain/tree/master/docs/docs/integrations to this new repo
- Updates a bunch of titles
- Fixes a bunch of rendering/syntax issues
- Replaces some notebook references, since those files were turned into .md files
- Ports tables and auto-generated docs lists/cards to Card components

## What this PR does not do

- Add the Integration section of the docs to the nav
- Fix all the broken links
- Fix every rendering issue
- Fix every title
- Finalize the sidebar
- Finalize table appearance and other theme settings
- Set up the contributing page

Co-authored-by: Lauren Hirata Singh <[email protected]>
1 parent 9e54192 commit 53f1cf9

File tree

1,298 files changed: +184,485 −0 lines changed


src/oss/images/base-chat-ui.mp4 (97.9 KB, binary file not shown)

src/oss/images/interrupt-chat-ui.mp4 (235 KB, binary file not shown)
Lines changed: 138 additions & 0 deletions
@@ -0,0 +1,138 @@
---
title: OpenAI Adapter (Old)
---
**Please ensure your OpenAI library version is below 1.0.0; otherwise, refer to the newer doc [OpenAI Adapter](/oss/integrations/adapters/openai/).**
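If you are unsure which major version you have installed, a quick check helps. This is a minimal sketch: `is_pre_v1` is an illustrative helper, not part of LangChain or OpenAI; in practice you would pass it `openai.__version__`.

```python
def is_pre_v1(version: str) -> bool:
    """Return True for version strings below 1.0.0, e.g. '0.28.1'."""
    return int(version.split(".")[0]) < 1

# In practice, pass openai.__version__ here.
print(is_pre_v1("0.28.1"))  # True: this page applies
print(is_pre_v1("1.3.0"))   # False: use the newer adapter doc
```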
A lot of people get started with OpenAI but want to explore other models, and LangChain's integrations with many model providers make this easy to do. While LangChain has its own message and model APIs, we've also made it as easy as possible to explore other models by exposing an adapter that adapts LangChain models to the OpenAI API.

At the moment the adapter only handles output and does not return other information (token counts, stop reasons, etc.).
```python
import openai
from langchain_community.adapters import openai as lc_openai
```

## ChatCompletion.create

```python
messages = [{"role": "user", "content": "hi"}]
```

Original OpenAI call:

```python
result = openai.ChatCompletion.create(
    messages=messages, model="gpt-3.5-turbo", temperature=0
)
result["choices"][0]["message"].to_dict_recursive()
```

```output
{'role': 'assistant', 'content': 'Hello! How can I assist you today?'}
```

LangChain OpenAI wrapper call:

```python
lc_result = lc_openai.ChatCompletion.create(
    messages=messages, model="gpt-3.5-turbo", temperature=0
)
lc_result["choices"][0]["message"]
```

```output
{'role': 'assistant', 'content': 'Hello! How can I assist you today?'}
```

Swapping out model providers:

```python
lc_result = lc_openai.ChatCompletion.create(
    messages=messages, model="claude-2", temperature=0, provider="ChatAnthropic"
)
lc_result["choices"][0]["message"]
```

```output
{'role': 'assistant', 'content': ' Hello!'}
```
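The `provider` argument names the LangChain chat model that should serve the request. The dispatch can be pictured roughly like this; the registry and function names below are hypothetical illustrations, not the adapter's actual code:

```python
# Hypothetical registry mapping provider names to backend factories.
PROVIDER_REGISTRY = {
    "ChatOpenAI": lambda model: f"openai backend for {model}",
    "ChatAnthropic": lambda model: f"anthropic backend for {model}",
}

def resolve_provider(provider: str, model: str) -> str:
    """Look up a provider by name, failing loudly on unknown names."""
    if provider not in PROVIDER_REGISTRY:
        raise ValueError(f"Unknown provider: {provider!r}")
    return PROVIDER_REGISTRY[provider](model)

print(resolve_provider("ChatAnthropic", "claude-2"))  # anthropic backend for claude-2
```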
## ChatCompletion.stream

Original OpenAI call:

```python
for c in openai.ChatCompletion.create(
    messages=messages, model="gpt-3.5-turbo", temperature=0, stream=True
):
    print(c["choices"][0]["delta"].to_dict_recursive())
```

```output
{'role': 'assistant', 'content': ''}
{'content': 'Hello'}
{'content': '!'}
{'content': ' How'}
{'content': ' can'}
{'content': ' I'}
{'content': ' assist'}
{'content': ' you'}
{'content': ' today'}
{'content': '?'}
{}
```
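The stream yields incremental deltas; stitching them back into one message is a common next step. A minimal pure-Python sketch, assuming delta dicts shaped like the output above (`accumulate_deltas` is an illustrative helper, not part of the adapter):

```python
def accumulate_deltas(deltas):
    """Merge a sequence of streaming delta dicts into one message dict."""
    message = {}
    for delta in deltas:
        for key, value in delta.items():
            if isinstance(value, str) and isinstance(message.get(key), str):
                message[key] += value  # concatenate string fields like 'content'
            elif value is not None:
                message[key] = value
    return message

chunks = [{"role": "assistant", "content": ""}, {"content": "Hello"}, {"content": "!"}, {}]
print(accumulate_deltas(chunks))  # {'role': 'assistant', 'content': 'Hello!'}
```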
LangChain OpenAI wrapper call:

```python
for c in lc_openai.ChatCompletion.create(
    messages=messages, model="gpt-3.5-turbo", temperature=0, stream=True
):
    print(c["choices"][0]["delta"])
```

```output
{'role': 'assistant', 'content': ''}
{'content': 'Hello'}
{'content': '!'}
{'content': ' How'}
{'content': ' can'}
{'content': ' I'}
{'content': ' assist'}
{'content': ' you'}
{'content': ' today'}
{'content': '?'}
{}
```

Swapping out model providers:

```python
for c in lc_openai.ChatCompletion.create(
    messages=messages,
    model="claude-2",
    temperature=0,
    stream=True,
    provider="ChatAnthropic",
):
    print(c["choices"][0]["delta"])
```

```output
{'role': 'assistant', 'content': ' Hello'}
{'content': '!'}
{}
```
Lines changed: 162 additions & 0 deletions
@@ -0,0 +1,162 @@
---
title: OpenAI Adapter
---
**Please ensure your OpenAI library version is 1.0.0 or higher; otherwise, refer to the older doc [OpenAI Adapter (Old)](/oss/integrations/adapters/openai-old/).**
A lot of people get started with OpenAI but want to explore other models, and LangChain's integrations with many model providers make this easy to do. While LangChain has its own message and model APIs, we've also made it as easy as possible to explore other models by exposing an adapter that adapts LangChain models to the OpenAI API.

At the moment the adapter only handles output and does not return other information (token counts, stop reasons, etc.).
```python
import openai
from langchain_community.adapters import openai as lc_openai
```

## chat.completions.create

```python
messages = [{"role": "user", "content": "hi"}]
```

Original OpenAI call:

```python
result = openai.chat.completions.create(
    messages=messages, model="gpt-3.5-turbo", temperature=0
)
result.choices[0].message.model_dump()
```

```output
{'content': 'Hello! How can I assist you today?',
 'role': 'assistant',
 'function_call': None,
 'tool_calls': None}
```
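Note that the v1 client fills unused fields such as `function_call` and `tool_calls` with `None`. If you want the compact shape the LangChain wrapper returns, a small helper can drop them; `compact` below is an illustration, not part of the adapter:

```python
def compact(message: dict) -> dict:
    """Remove None-valued fields from a dumped message dict."""
    return {k: v for k, v in message.items() if v is not None}

dumped = {"content": "Hello! How can I assist you today?", "role": "assistant",
          "function_call": None, "tool_calls": None}
print(compact(dumped))  # {'content': 'Hello! How can I assist you today?', 'role': 'assistant'}
```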
LangChain OpenAI wrapper call:

```python
lc_result = lc_openai.chat.completions.create(
    messages=messages, model="gpt-3.5-turbo", temperature=0
)

lc_result.choices[0].message  # Attribute access
```

```output
{'role': 'assistant', 'content': 'Hello! How can I help you today?'}
```

```python
lc_result["choices"][0]["message"]  # Also compatible with index access
```

```output
{'role': 'assistant', 'content': 'Hello! How can I help you today?'}
```
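Supporting both attribute and index access, as the wrapper's result does, can be modeled with a dict subclass. This is only a sketch of the behavior, not the adapter's implementation:

```python
class DualAccessDict(dict):
    """Dict whose keys are also readable as attributes (illustration only)."""

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails, so dict methods still work.
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name) from None

msg = DualAccessDict(role="assistant", content="Hello! How can I help you today?")
print(msg.content == msg["content"])  # True
```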
Swapping out model providers:

```python
lc_result = lc_openai.chat.completions.create(
    messages=messages, model="claude-2", temperature=0, provider="ChatAnthropic"
)
lc_result.choices[0].message
```

```output
{'role': 'assistant', 'content': 'Hello! How can I assist you today?'}
```
## chat.completions.stream

Original OpenAI call:

```python
for c in openai.chat.completions.create(
    messages=messages, model="gpt-3.5-turbo", temperature=0, stream=True
):
    print(c.choices[0].delta.model_dump())
```

```output
{'content': '', 'function_call': None, 'role': 'assistant', 'tool_calls': None}
{'content': 'Hello', 'function_call': None, 'role': None, 'tool_calls': None}
{'content': '!', 'function_call': None, 'role': None, 'tool_calls': None}
{'content': ' How', 'function_call': None, 'role': None, 'tool_calls': None}
{'content': ' can', 'function_call': None, 'role': None, 'tool_calls': None}
{'content': ' I', 'function_call': None, 'role': None, 'tool_calls': None}
{'content': ' assist', 'function_call': None, 'role': None, 'tool_calls': None}
{'content': ' you', 'function_call': None, 'role': None, 'tool_calls': None}
{'content': ' today', 'function_call': None, 'role': None, 'tool_calls': None}
{'content': '?', 'function_call': None, 'role': None, 'tool_calls': None}
{'content': None, 'function_call': None, 'role': None, 'tool_calls': None}
```
LangChain OpenAI wrapper call:

```python
for c in lc_openai.chat.completions.create(
    messages=messages, model="gpt-3.5-turbo", temperature=0, stream=True
):
    print(c.choices[0].delta)
```

```output
{'role': 'assistant', 'content': ''}
{'content': 'Hello'}
{'content': '!'}
{'content': ' How'}
{'content': ' can'}
{'content': ' I'}
{'content': ' assist'}
{'content': ' you'}
{'content': ' today'}
{'content': '?'}
{}
```

Swapping out model providers:

```python
for c in lc_openai.chat.completions.create(
    messages=messages,
    model="claude-2",
    temperature=0,
    stream=True,
    provider="ChatAnthropic",
):
    print(c["choices"][0]["delta"])
```

```output
{'role': 'assistant', 'content': ''}
{'content': 'Hello'}
{'content': '!'}
{'content': ' How'}
{'content': ' can'}
{'content': ' I'}
{'content': ' assist'}
{'content': ' you'}
{'content': ' today'}
{'content': '?'}
{}
```
