httpx.ConnectError: [Errno 8] nodename nor servname provided, or not known

Ludwig, Dana 0 Reputation points
2025-12-09T05:14:44.2666667+00:00

Here is the script I submitted. It failed on client.responses.create():

======================

# tmp.py -- minimal Responses API call against the Azure OpenAI endpoint
import os

from dotenv import load_dotenv
from openai import OpenAI

# Pull OPENAI_API_KEY from the local .env file
environment_file = ".env"
load_dotenv(environment_file)

client = OpenAI(
    api_key=os.getenv("OPENAI_API_KEY"),
    base_url="https://dana-instance-gpt51-instance2.openai.azure.com/openai/v1/",
)
print("we completed the client open")

# This is the call that fails
response = client.responses.create(
    model="gpt-5.1",
    input="Write a haiku about code.",
    reasoning={"effort": "low"},
    text={"verbosity": "low"},
)
print(response.output_text)

=====
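
The error in the title, [Errno 8] "nodename nor servname provided, or not known", is macOS's getaddrinfo() failure: the hostname never resolved, so no request ever left the machine. One quick way to confirm that is to resolve the endpoint hostname directly. This is only a diagnostic sketch, reusing the hostname from the script above:

=====

import socket

# [Errno 8] "nodename nor servname provided, or not known" is what macOS's
# getaddrinfo() returns when a hostname does not resolve, i.e. the failure
# happens at DNS lookup, before any HTTP request is sent.
host = "dana-instance-gpt51-instance2.openai.azure.com"  # hostname from the script above

try:
    infos = socket.getaddrinfo(host, 443)
    print(f"{host} resolves to:", sorted({info[4][0] for info in infos}))
except socket.gaierror as exc:
    print(f"DNS lookup for {host} failed: {exc}")

=====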

Here is the stack trace:

=========

(adrenv) danaludwig@Danas-MacBook-Pro-826 Scripts % python3 tmp.py
we completed the client open
Traceback (most recent call last):
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
    yield
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpx/_transports/default.py", line 250, in handle_request
    resp = self._pool.handle_request(req)
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpcore/_sync/connection_pool.py", line 256, in handle_request
    raise exc from None
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpcore/_sync/connection_pool.py", line 236, in handle_request
    response = connection.handle_request(
        pool_request.request
    )
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
    raise exc
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpcore/_sync/connection.py", line 78, in handle_request
    stream = self._connect(request)
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpcore/_sync/connection.py", line 124, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpcore/_backends/sync.py", line 207, in connect_tcp
    with map_exceptions(exc_map):
         ~~~~~~~~~~~~~~^^^^^^^^^
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/contextlib.py", line 162, in __exit__
    self.gen.throw(value)
    ~~~~~~~~~~~~~~^^^^^^^
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 8] nodename nor servname provided, or not known

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/openai/_base_client.py", line 982, in request
    response = self._client.send(
        request,
        stream=stream or self._should_stream_response_body(request=request),
        **kwargs,
    )
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
        request,
    ...<2 lines>...
        history=[],
    )
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
        request,
        follow_redirects=follow_redirects,
        history=history,
    )
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpx/_client.py", line 1014, in _send_single_request
    response = transport.handle_request(request)
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpx/_transports/default.py", line 249, in handle_request
    with map_httpcore_exceptions():
         ~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/contextlib.py", line 162, in __exit__
    self.gen.throw(value)
    ~~~~~~~~~~~~~~^^^^^^^
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 8] nodename nor servname provided, or not known

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/danaludwig/Documents/a_Inspir E1505 Desktop-backup folder/A_UCSF_Research/Adverse Drug Events/repo_for_ade_paper/Scripts/tmp.py", line 14, in <module>
    response = client.responses.create(
        model="gpt-5.1",
    ...<2 lines>...
        text={"verbosity":"low"},
    )
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/openai/resources/responses/responses.py", line 866, in create
    return self._post(
           ~~~~~~~~~~^
        "/responses",
        ^^^^^^^^^^^^^
    ...<40 lines>...
        stream_cls=Stream[ResponseStreamEvent],
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/openai/_base_client.py", line 1259, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/danaludwig/anaconda3/envs/adrenv/lib/python3.13/site-packages/openai/_base_client.py", line 1014, in request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.

==============
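
The final openai.APIConnectionError hides the transport error unless you look at its cause (it is raised "from err"). Here is a small, hedged sketch, reusing the client constructed in the script above, that prints the underlying httpx/httpcore exception:

=====

import openai

# `client` is the OpenAI client constructed in the script above.
try:
    response = client.responses.create(
        model="gpt-5.1",
        input="Write a haiku about code.",
    )
    print(response.output_text)
except openai.APIConnectionError as err:
    # APIConnectionError is raised "from err", so __cause__ holds the original
    # httpx/httpcore exception (here, the DNS resolution failure).
    print("Connection failed:", repr(err.__cause__))

=====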

Thank you! - Dana Ludwig

Azure OpenAI Service

1 answer

  1. Ludwig, Dana 0 Reputation points
    2025-12-10T17:36:31.9333333+00:00

    One more point. A lot of these headaches are my problem, but I would like to give you suggestions that would help other "slow learners" like me:

    This is miles from the self-service model that a customer like me needs. It isn't point-and-click: enter your credit card number, then get a result. There are too many mandatory layers that all have to be correct, and each one requires significant research (thank you, Google AI support). I should not have to discover that the region name "US East 2" is required to get your most common model, gpt-5.1. I should not have to (1) register, (2) subscribe, (3) create a resource, and (4) create a deployment just to make my "hello world" script work with your single recommended LLM model, "gpt-5.1". I should not be constantly prompted to sign up for "Foundry", which I may or may not need to get my "hello world" script to work. I should CERTAINLY not have to pay $37/month for level-2 support just to use your community Q&A forum! All of these barriers will keep Microsoft in the backwaters of cloud services until this workflow is dramatically simplified.
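
    For reference, once a resource and a deployment do exist, the smallest "hello world" I would expect looks roughly like the sketch below. This is a hedged sketch, not official guidance: the environment variable names AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY are placeholders I chose, and "gpt-5.1" is assumed to be the deployment name created in the portal rather than a generic model name.

    =====

    # Hedged sketch: endpoint, key, and deployment name are placeholders read
    # from the environment instead of being hard-coded in the script.
    import os

    from openai import OpenAI

    # e.g. https://<your-resource-name>.openai.azure.com (from Keys & Endpoint)
    endpoint = os.environ["AZURE_OPENAI_ENDPOINT"].rstrip("/")

    client = OpenAI(
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        base_url=f"{endpoint}/openai/v1/",
    )

    response = client.responses.create(
        model="gpt-5.1",  # assumed to be the deployment name, not just the model family
        input="Write a haiku about code.",
    )
    print(response.output_text)

    =====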

