path: root/.venv/lib/python3.12/site-packages/anthropic-0.49.0.dist-info/METADATA
+Metadata-Version: 2.4
+Name: anthropic
+Version: 0.49.0
+Summary: The official Python library for the anthropic API
+Project-URL: Homepage, https://github.com/anthropics/anthropic-sdk-python
+Project-URL: Repository, https://github.com/anthropics/anthropic-sdk-python
+Author-email: Anthropic <support@anthropic.com>
+License-Expression: MIT
+License-File: LICENSE
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Operating System :: MacOS
+Classifier: Operating System :: Microsoft :: Windows
+Classifier: Operating System :: OS Independent
+Classifier: Operating System :: POSIX
+Classifier: Operating System :: POSIX :: Linux
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3.12
+Classifier: Topic :: Software Development :: Libraries :: Python Modules
+Classifier: Typing :: Typed
+Requires-Python: >=3.8
+Requires-Dist: anyio<5,>=3.5.0
+Requires-Dist: distro<2,>=1.7.0
+Requires-Dist: httpx<1,>=0.23.0
+Requires-Dist: jiter<1,>=0.4.0
+Requires-Dist: pydantic<3,>=1.9.0
+Requires-Dist: sniffio
+Requires-Dist: typing-extensions<5,>=4.10
+Provides-Extra: bedrock
+Requires-Dist: boto3>=1.28.57; extra == 'bedrock'
+Requires-Dist: botocore>=1.31.57; extra == 'bedrock'
+Provides-Extra: vertex
+Requires-Dist: google-auth<3,>=2; extra == 'vertex'
+Description-Content-Type: text/markdown
+
+# Anthropic Python API library
+
+[![PyPI version](https://img.shields.io/pypi/v/anthropic.svg)](https://pypi.org/project/anthropic/)
+
+The Anthropic Python library provides convenient access to the Anthropic REST API from any Python 3.8+
+application. It includes type definitions for all request params and response fields,
+and offers both synchronous and asynchronous clients powered by [httpx](https://github.com/encode/httpx).
+
+## Documentation
+
+The REST API documentation can be found on [docs.anthropic.com](https://docs.anthropic.com/claude/reference/). The full API of this library can be found in [api.md](https://github.com/anthropics/anthropic-sdk-python/tree/main/api.md).
+
+## Installation
+
+```sh
+# install from PyPI
+pip install anthropic
+```
+
+## Usage
+
+The full API of this library can be found in [api.md](https://github.com/anthropics/anthropic-sdk-python/tree/main/api.md).
+
+```python
+import os
+from anthropic import Anthropic
+
+client = Anthropic(
+    api_key=os.environ.get("ANTHROPIC_API_KEY"),  # This is the default and can be omitted
+)
+
+message = client.messages.create(
+    max_tokens=1024,
+    messages=[
+        {
+            "role": "user",
+            "content": "Hello, Claude",
+        }
+    ],
+    model="claude-3-5-sonnet-latest",
+)
+print(message.content)
+```
+
+While you can provide an `api_key` keyword argument,
+we recommend using [python-dotenv](https://pypi.org/project/python-dotenv/)
+to add `ANTHROPIC_API_KEY="my-anthropic-api-key"` to your `.env` file
+so that your API Key is not stored in source control.
+
+## Async usage
+
+Simply import `AsyncAnthropic` instead of `Anthropic` and use `await` with each API call:
+
+```python
+import os
+import asyncio
+from anthropic import AsyncAnthropic
+
+client = AsyncAnthropic(
+    api_key=os.environ.get("ANTHROPIC_API_KEY"),  # This is the default and can be omitted
+)
+
+
+async def main() -> None:
+    message = await client.messages.create(
+        max_tokens=1024,
+        messages=[
+            {
+                "role": "user",
+                "content": "Hello, Claude",
+            }
+        ],
+        model="claude-3-5-sonnet-latest",
+    )
+    print(message.content)
+
+
+asyncio.run(main())
+```
+
+Functionality between the synchronous and asynchronous clients is otherwise identical.
+
+## Streaming responses
+
+We provide support for streaming responses using Server-Sent Events (SSE).
+
+```python
+from anthropic import Anthropic
+
+client = Anthropic()
+
+stream = client.messages.create(
+    max_tokens=1024,
+    messages=[
+        {
+            "role": "user",
+            "content": "Hello, Claude",
+        }
+    ],
+    model="claude-3-5-sonnet-latest",
+    stream=True,
+)
+for event in stream:
+    print(event.type)
+```
+
+The async client uses the exact same interface.
+
+```python
+from anthropic import AsyncAnthropic
+
+client = AsyncAnthropic()
+
+stream = await client.messages.create(
+    max_tokens=1024,
+    messages=[
+        {
+            "role": "user",
+            "content": "Hello, Claude",
+        }
+    ],
+    model="claude-3-5-sonnet-latest",
+    stream=True,
+)
+async for event in stream:
+    print(event.type)
+```
+
+### Streaming Helpers
+
+This library provides several conveniences for streaming messages, for example:
+
+```py
+import asyncio
+from anthropic import AsyncAnthropic
+
+client = AsyncAnthropic()
+
+async def main() -> None:
+    async with client.messages.stream(
+        max_tokens=1024,
+        messages=[
+            {
+                "role": "user",
+                "content": "Say hello there!",
+            }
+        ],
+        model="claude-3-5-sonnet-latest",
+    ) as stream:
+        async for text in stream.text_stream:
+            print(text, end="", flush=True)
+        print()
+
+    message = await stream.get_final_message()
+    print(message.to_json())
+
+asyncio.run(main())
+```
+
+Streaming with `client.messages.stream(...)` exposes [various helpers for your convenience](https://github.com/anthropics/anthropic-sdk-python/tree/main/helpers.md) including accumulation & SDK-specific events.
+
+Alternatively, you can use `client.messages.create(..., stream=True)` which only returns an async iterable of the events in the stream and thus uses less memory (it does not build up a final message object for you).
+
+## Token counting
+
+To get the token count for a message without creating it, you can use the `client.beta.messages.count_tokens()` method. This takes the same `messages` list as the `.create()` method.
+
+```py
+count = client.beta.messages.count_tokens(
+    model="claude-3-5-sonnet-20241022",
+    messages=[
+        {"role": "user", "content": "Hello, world"}
+    ]
+)
+count.input_tokens  # 10
+```
+
+You can also see the exact usage for a given request through the `usage` response property, e.g.
+
+```py
+message = client.messages.create(...)
+message.usage
+# Usage(input_tokens=25, output_tokens=13)
+```
+
+## Message Batches
+
+This SDK provides beta support for the [Message Batches API](https://docs.anthropic.com/en/docs/build-with-claude/message-batches) under the `client.beta.messages.batches` namespace.
+
+
+### Creating a batch
+
+Message Batches take the exact same request params as the standard Messages API:
+
+```python
+await client.beta.messages.batches.create(
+    requests=[
+        {
+            "custom_id": "my-first-request",
+            "params": {
+                "model": "claude-3-5-sonnet-latest",
+                "max_tokens": 1024,
+                "messages": [{"role": "user", "content": "Hello, world"}],
+            },
+        },
+        {
+            "custom_id": "my-second-request",
+            "params": {
+                "model": "claude-3-5-sonnet-latest",
+                "max_tokens": 1024,
+                "messages": [{"role": "user", "content": "Hi again, friend"}],
+            },
+        },
+    ]
+)
+```
+
+
+### Getting results from a batch
+
+Once a Message Batch has been processed, indicated by `.processing_status == "ended"`, you can access the results with `.batches.results()`:
+
+```python
+result_stream = await client.beta.messages.batches.results(batch_id)
+async for entry in result_stream:
+    if entry.result.type == "succeeded":
+        print(entry.result.message.content)
+```
+
+## Tool use
+
+This SDK provides support for tool use, aka function calling. More details can be found in [the documentation](https://docs.anthropic.com/claude/docs/tool-use).
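For instance, a tool is described to the API as a JSON-schema dictionary passed via the `tools` parameter of `client.messages.create()`. The `get_weather` tool below is a hypothetical sketch:

```python
# Hypothetical tool definition; the shape (name / description / input_schema)
# follows the Messages API tool-use format.
get_weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a given location.",
    "input_schema": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "City name, e.g. San Francisco",
            }
        },
        "required": ["location"],
    },
}

# You would then pass `tools=[get_weather_tool]` to `client.messages.create(...)`;
# when Claude decides to invoke the tool, the response's `stop_reason` is
# "tool_use" and its `content` includes a `tool_use` block with the input.
```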
+
+## AWS Bedrock
+
+This library also provides support for the [Anthropic Bedrock API](https://aws.amazon.com/bedrock/claude/) if you install this library with the `bedrock` extra, e.g. `pip install -U 'anthropic[bedrock]'`.
+
+You can then import and instantiate a separate `AnthropicBedrock` class; the rest of the API is the same.
+
+```py
+from anthropic import AnthropicBedrock
+
+client = AnthropicBedrock()
+
+message = client.messages.create(
+    max_tokens=1024,
+    messages=[
+        {
+            "role": "user",
+            "content": "Hello!",
+        }
+    ],
+    model="anthropic.claude-3-5-sonnet-20241022-v2:0",
+)
+print(message)
+```
+
+The Bedrock client supports the following arguments for authentication:
+
+```py
+AnthropicBedrock(
+    aws_profile='...',
+    aws_region='us-east-1',
+    aws_secret_key='...',
+    aws_access_key='...',
+    aws_session_token='...',
+)
+```
+
+For a more fully fledged example see [`examples/bedrock.py`](https://github.com/anthropics/anthropic-sdk-python/blob/main/examples/bedrock.py).
+
+## Google Vertex
+
+This library also provides support for the [Anthropic Vertex API](https://cloud.google.com/vertex-ai?hl=en) if you install this library with the `vertex` extra, e.g. `pip install -U 'anthropic[vertex]'`.
+
+You can then import and instantiate a separate `AnthropicVertex`/`AsyncAnthropicVertex` class, which has the same API as the base `Anthropic`/`AsyncAnthropic` class.
+
+```py
+from anthropic import AnthropicVertex
+
+client = AnthropicVertex()
+
+message = client.messages.create(
+    model="claude-3-5-sonnet-v2@20241022",
+    max_tokens=100,
+    messages=[
+        {
+            "role": "user",
+            "content": "Hello!",
+        }
+    ],
+)
+print(message)
+```
+
+For a more complete example see [`examples/vertex.py`](https://github.com/anthropics/anthropic-sdk-python/blob/main/examples/vertex.py).
+
+## Using types
+
+Nested request parameters are [TypedDicts](https://docs.python.org/3/library/typing.html#typing.TypedDict). Responses are [Pydantic models](https://docs.pydantic.dev) which also provide helper methods for things like:
+
+- Serializing back into JSON, `model.to_json()`
+- Converting to a dictionary, `model.to_dict()`
+
+Typed requests and responses provide autocomplete and documentation within your editor. If you would like to see type errors in VS Code to help catch bugs earlier, set `python.analysis.typeCheckingMode` to `basic`.
+
+## Pagination
+
+List methods in the Anthropic API are paginated.
+
+This library provides auto-paginating iterators with each list response, so you do not have to request successive pages manually:
+
+```python
+from anthropic import Anthropic
+
+client = Anthropic()
+
+all_batches = []
+# Automatically fetches more pages as needed.
+for batch in client.beta.messages.batches.list(
+    limit=20,
+):
+    # Do something with batch here
+    all_batches.append(batch)
+print(all_batches)
+```
+
+Or, asynchronously:
+
+```python
+import asyncio
+from anthropic import AsyncAnthropic
+
+client = AsyncAnthropic()
+
+
+async def main() -> None:
+    all_batches = []
+    # Iterate through items across all pages, issuing requests as needed.
+    async for batch in client.beta.messages.batches.list(
+        limit=20,
+    ):
+        all_batches.append(batch)
+    print(all_batches)
+
+
+asyncio.run(main())
+```
+
+Alternatively, you can use the `.has_next_page()`, `.next_page_info()`, or `.get_next_page()` methods for more granular control when working with pages:
+
+```python
+first_page = await client.beta.messages.batches.list(
+    limit=20,
+)
+if first_page.has_next_page():
+    print(f"will fetch next page using these details: {first_page.next_page_info()}")
+    next_page = await first_page.get_next_page()
+    print(f"number of items we just fetched: {len(next_page.data)}")
+
+# Remove `await` for non-async usage.
+```
+
+Or just work directly with the returned data:
+
+```python
+first_page = await client.beta.messages.batches.list(
+    limit=20,
+)
+
+print(f"next page cursor: {first_page.last_id}")  # => "next page cursor: ..."
+for batch in first_page.data:
+    print(batch.id)
+
+# Remove `await` for non-async usage.
+```
+
+## Handling errors
+
+When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `anthropic.APIConnectionError` is raised.
+
+When the API returns a non-success status code (that is, 4xx or 5xx
+response), a subclass of `anthropic.APIStatusError` is raised, containing `status_code` and `response` properties.
+
+All errors inherit from `anthropic.APIError`.
+
+```python
+import anthropic
+from anthropic import Anthropic
+
+client = Anthropic()
+
+try:
+    client.messages.create(
+        max_tokens=1024,
+        messages=[
+            {
+                "role": "user",
+                "content": "Hello, Claude",
+            }
+        ],
+        model="claude-3-5-sonnet-latest",
+    )
+except anthropic.APIConnectionError as e:
+    print("The server could not be reached")
+    print(e.__cause__)  # an underlying Exception, likely raised within httpx.
+except anthropic.RateLimitError as e:
+    print("A 429 status code was received; we should back off a bit.")
+except anthropic.APIStatusError as e:
+    print("Another non-200-range status code was received")
+    print(e.status_code)
+    print(e.response)
+```
+
+Error codes are as follows:
+
+| Status Code | Error Type                 |
+| ----------- | -------------------------- |
+| 400         | `BadRequestError`          |
+| 401         | `AuthenticationError`      |
+| 403         | `PermissionDeniedError`    |
+| 404         | `NotFoundError`            |
+| 422         | `UnprocessableEntityError` |
+| 429         | `RateLimitError`           |
+| >=500       | `InternalServerError`      |
+| N/A         | `APIConnectionError`       |
+
+## Request IDs
+
+> For more information on debugging requests, see [these docs](https://docs.anthropic.com/en/api/errors#request-id)
+
+All object responses in the SDK provide a `_request_id` property which is added from the `request-id` response header so that you can quickly log failing requests and report them back to Anthropic.
+
+```python
+message = client.messages.create(
+    max_tokens=1024,
+    messages=[
+        {
+            "role": "user",
+            "content": "Hello, Claude",
+        }
+    ],
+    model="claude-3-5-sonnet-latest",
+)
+print(message._request_id)  # req_018EeWyXxfu5pfWkrYcMdjWG
+```
+
+Note that unlike other properties that use an `_` prefix, the `_request_id` property
+*is* public. Unless documented otherwise, *all* other `_` prefix properties,
+methods and modules are *private*.
+
+### Retries
+
+Certain errors are automatically retried 2 times by default, with a short exponential backoff.
+Connection errors (for example, due to a network connectivity problem), 408 Request Timeout, 409 Conflict,
+429 Rate Limit, and >=500 Internal errors are all retried by default.
+
+You can use the `max_retries` option to configure or disable retry settings:
+
+```python
+from anthropic import Anthropic
+
+# Configure the default for all requests:
+client = Anthropic(
+    # default is 2
+    max_retries=0,
+)
+
+# Or, configure per-request:
+client.with_options(max_retries=5).messages.create(
+    max_tokens=1024,
+    messages=[
+        {
+            "role": "user",
+            "content": "Hello, Claude",
+        }
+    ],
+    model="claude-3-5-sonnet-latest",
+)
+```
+
+### Timeouts
+
+By default requests time out after 10 minutes. You can configure this with a `timeout` option,
+which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/#fine-tuning-the-configuration) object:
+
+```python
+import httpx
+from anthropic import Anthropic
+
+# Configure the default for all requests:
+client = Anthropic(
+    # 20 seconds (default is 10 minutes)
+    timeout=20.0,
+)
+
+# More granular control:
+client = Anthropic(
+    timeout=httpx.Timeout(60.0, read=5.0, write=10.0, connect=2.0),
+)
+
+# Override per-request:
+client.with_options(timeout=5.0).messages.create(
+    max_tokens=1024,
+    messages=[
+        {
+            "role": "user",
+            "content": "Hello, Claude",
+        }
+    ],
+    model="claude-3-5-sonnet-latest",
+)
+```
+
+On timeout, an `APITimeoutError` is raised.
+
+Note that requests that time out are [retried twice by default](https://github.com/anthropics/anthropic-sdk-python/tree/main/#retries).
+
+### Long Requests
+
+> [!IMPORTANT]
+> We highly encourage you to use the streaming [Messages API](https://github.com/anthropics/anthropic-sdk-python/tree/main/#streaming-responses) for long-running requests.
+
+We do not recommend setting a large `max_tokens` value without using streaming.
+Some networks may drop idle connections after a certain period of time, which
+can cause the request to fail or [timeout](https://github.com/anthropics/anthropic-sdk-python/tree/main/#timeouts) without receiving a response from Anthropic.
+
+This SDK will also raise a `ValueError` if a non-streaming request is expected to take longer than roughly 10 minutes.
+Passing `stream=True` or [overriding](https://github.com/anthropics/anthropic-sdk-python/tree/main/#timeouts) the `timeout` option at the client or request level disables this error.
+
+An expected request latency longer than the [timeout](https://github.com/anthropics/anthropic-sdk-python/tree/main/#timeouts) for a non-streaming request
+will result in the client terminating the connection and retrying without receiving a response.
+
+We set a [TCP socket keep-alive](https://tldp.org/HOWTO/TCP-Keepalive-HOWTO/overview.html) option in order
+to reduce the impact of idle connection timeouts on some networks.
+This can be [overridden](https://github.com/anthropics/anthropic-sdk-python/tree/main/#Configuring-the-HTTP-client) by passing an `http_client` option to the client.
+
+## Default Headers
+
+We automatically send the `anthropic-version` header set to `2023-06-01`.
+
+If you need to, you can override it by setting default headers per-request or on the client object.
+
+Be aware that doing so may result in incorrect types and other unexpected or undefined behavior in the SDK.
+
+```python
+from anthropic import Anthropic
+
+client = Anthropic(
+    default_headers={"anthropic-version": "My-Custom-Value"},
+)
+```
+
+## Advanced
+
+### Logging
+
+We use the standard library [`logging`](https://docs.python.org/3/library/logging.html) module.
+
+You can enable logging by setting the environment variable `ANTHROPIC_LOG` to `info`.
+
+```shell
+$ export ANTHROPIC_LOG=info
+```
+
+Or set it to `debug` for more verbose logging.
+
+### How to tell whether `None` means `null` or missing
+
+In an API response, a field may be explicitly `null`, or missing entirely; in either case, its value is `None` in this library. You can differentiate the two cases with `.model_fields_set`:
+
+```py
+if response.my_field is None:
+    if 'my_field' not in response.model_fields_set:
+        print('Got json like {}, without a "my_field" key present at all.')
+    else:
+        print('Got json like {"my_field": null}.')
+```
+
+### Accessing raw response data (e.g. headers)
+
+The "raw" Response object can be accessed by prefixing `.with_raw_response.` to any HTTP method call, e.g.,
+
+```py
+from anthropic import Anthropic
+
+client = Anthropic()
+response = client.messages.with_raw_response.create(
+    max_tokens=1024,
+    messages=[{
+        "role": "user",
+        "content": "Hello, Claude",
+    }],
+    model="claude-3-5-sonnet-latest",
+)
+print(response.headers.get('X-My-Header'))
+
+message = response.parse()  # get the object that `messages.create()` would have returned
+print(message.content)
+```
+
+These methods return a [`LegacyAPIResponse`](https://github.com/anthropics/anthropic-sdk-python/tree/main/src/anthropic/_legacy_response.py) object. This is a legacy class as we're changing it slightly in the next major version.
+
+For the sync client this will mostly be the same, with the exception that
+`content` & `text` will be methods instead of properties. In the
+async client, all methods will be async.
+
+A migration script will be provided & the migration in general should
+be smooth.
+
+#### `.with_streaming_response`
+
+The above interface eagerly reads the full response body when you make the request, which may not always be what you want.
+
+To stream the response body, use `.with_streaming_response` instead, which requires a context manager and only reads the response body once you call `.read()`, `.text()`, `.json()`, `.iter_bytes()`, `.iter_text()`, `.iter_lines()` or `.parse()`. In the async client, these are async methods.
+
+As such, `.with_streaming_response` methods return a different [`APIResponse`](https://github.com/anthropics/anthropic-sdk-python/tree/main/src/anthropic/_response.py) object, and the async client returns an [`AsyncAPIResponse`](https://github.com/anthropics/anthropic-sdk-python/tree/main/src/anthropic/_response.py) object.
+
+```python
+with client.messages.with_streaming_response.create(
+    max_tokens=1024,
+    messages=[
+        {
+            "role": "user",
+            "content": "Hello, Claude",
+        }
+    ],
+    model="claude-3-5-sonnet-latest",
+) as response:
+    print(response.headers.get("X-My-Header"))
+
+    for line in response.iter_lines():
+        print(line)
+```
+
+The context manager is required so that the response will reliably be closed.
+
+### Making custom/undocumented requests
+
+This library is typed for convenient access to the documented API.
+
+If you need to access undocumented endpoints, params, or response properties, the library can still be used.
+
+#### Undocumented endpoints
+
+To make requests to undocumented endpoints, you can use `client.get`, `client.post`, and other
+HTTP verbs. Client options, such as retries, will be respected when making these requests.
+
+```py
+import httpx
+
+response = client.post(
+    "/foo",
+    cast_to=httpx.Response,
+    body={"my_param": True},
+)
+
+print(response.headers.get("x-foo"))
+```
+
+#### Undocumented request params
+
+If you want to explicitly send an extra param, you can do so with the `extra_query`, `extra_body`, and `extra_headers` request
+options.
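For example (a sketch; `X-My-Header`, `my_query_param`, and `my_undocumented_param` are hypothetical names):

```python
# Hypothetical extra request options, merged into the request alongside
# the typed parameters.
extra_options = {
    "extra_headers": {"X-My-Header": "value"},
    "extra_query": {"my_query_param": "value"},
    "extra_body": {"my_undocumented_param": True},
}

# client.messages.create(
#     max_tokens=1024,
#     messages=[{"role": "user", "content": "Hello, Claude"}],
#     model="claude-3-5-sonnet-latest",
#     **extra_options,
# )
```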
+
+#### Undocumented response properties
+
+To access undocumented response properties, you can access the extra fields like `response.unknown_prop`. You
+can also get all the extra fields on the Pydantic model as a dict with
+[`response.model_extra`](https://docs.pydantic.dev/latest/api/base_model/#pydantic.BaseModel.model_extra).
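As an illustration with plain Pydantic v2 (a sketch; `unknown_prop` is hypothetical, and the SDK's response models are configured similarly to retain extra fields):

```python
from pydantic import BaseModel, ConfigDict


class ExampleResponse(BaseModel):
    """Stand-in for an SDK response model that keeps unrecognized fields."""

    model_config = ConfigDict(extra="allow")
    known_prop: int


response = ExampleResponse.model_validate({"known_prop": 1, "unknown_prop": "surprise"})
print(response.unknown_prop)  # extra fields are reachable as attributes
print(response.model_extra)   # all extra fields as a dict
```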
+
+### Configuring the HTTP client
+
+You can directly override the [httpx client](https://www.python-httpx.org/api/#client) to customize it for your use case, including:
+
+- Support for [proxies](https://www.python-httpx.org/advanced/proxies/)
+- Custom [transports](https://www.python-httpx.org/advanced/transports/)
+- Additional [advanced](https://www.python-httpx.org/advanced/clients/) functionality
+
+```python
+import httpx
+from anthropic import Anthropic, DefaultHttpxClient
+
+client = Anthropic(
+    # Or use the `ANTHROPIC_BASE_URL` env var
+    base_url="http://my.test.server.example.com:8083",
+    http_client=DefaultHttpxClient(
+        proxy="http://my.test.proxy.example.com",
+        transport=httpx.HTTPTransport(local_address="0.0.0.0"),
+    ),
+)
+```
+
+You can also customize the client on a per-request basis by using `with_options()`:
+
+```python
+client.with_options(http_client=DefaultHttpxClient(...))
+```
+
+### Managing HTTP resources
+
+By default the library closes underlying HTTP connections whenever the client is [garbage collected](https://docs.python.org/3/reference/datamodel.html#object.__del__). You can manually close the client using the `.close()` method if desired, or with a context manager that closes when exiting.
+
+```py
+from anthropic import Anthropic
+
+with Anthropic() as client:
+    # make requests here
+    ...
+
+# HTTP client is now closed
+```
+
+## Versioning
+
+This package generally follows [SemVer](https://semver.org/spec/v2.0.0.html) conventions, though certain backwards-incompatible changes may be released as minor versions:
+
+1. Changes that only affect static types, without breaking runtime behavior.
+2. Changes to library internals which are technically public but not intended or documented for external use. _(Please open a GitHub issue to let us know if you are relying on such internals.)_
+3. Changes that we do not expect to impact the vast majority of users in practice.
+
+We take backwards-compatibility seriously and work hard to ensure you can rely on a smooth upgrade experience.
+
+We are keen for your feedback; please open an [issue](https://www.github.com/anthropics/anthropic-sdk-python/issues) with questions, bugs, or suggestions.
+
+### Determining the installed version
+
+If you've upgraded to the latest version but aren't seeing any new features you were expecting, then your Python environment is likely still using an older version.
+
+You can determine the version that is being used at runtime with:
+
+```py
+import anthropic
+print(anthropic.__version__)
+```
+
+## Requirements
+
+Python 3.8 or higher.
+
+## Contributing
+
+See [the contributing documentation](https://github.com/anthropics/anthropic-sdk-python/tree/main/./CONTRIBUTING.md).