Asterisk to OpenAI Realtime - Auto-install Script

Hello Asterisk community!
I would like to share with you a tutorial post on connecting Asterisk 20+ with the OpenAI Realtime models, in an easy and simple way, with just one command!

This time I’m using Azure to test for the lowest possible latency, since the OpenAI inference servers run on the same cloud.

You need:
An Azure Ubuntu 24 instance with 1 vCPU and 2 GB of RAM.
An OpenAI API key.
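For reference, here is a minimal sketch of provisioning such an instance with the Azure CLI. The resource group, VM name, admin username, and size are my own illustrative assumptions, not from the original post (Standard_B1ms happens to match the 1 vCPU / 2 GB spec):

```shell
# Illustrative only: resource group, VM name, and username are assumptions.
az group create --name asterisk-rg --location eastus2

# Standard_B1ms = 1 vCPU, 2 GB RAM; Ubuntu 24.04 LTS image.
az vm create \
  --resource-group asterisk-rg \
  --name asterisk-vm \
  --image Canonical:ubuntu-24_04-lts:server:latest \
  --size Standard_B1ms \
  --admin-username azureuser \
  --generate-ssh-keys
```

After that, SSH in and run the install script as the tutorial describes.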

Greetings!

I think you are refining the definition of cloud here. I understand “cloud” to be the internet, so there is only one cloud.

I also don’t think there is any guarantee that the machines with the big GPUs are in the same data centre as the one to which general-use VMs get allocated. The term “cloud” came about because people didn’t need to understand the actual structure. It’s sort of “black box” applied to a distributed system.

It may be that they are in the same centre in your case, but I don’t think that will be part of the contract; it may change, and may depend on the customer location.


Hi! Setting aside the question of what is the internet or the cloud, I was curious about the latency from different public cloud providers to the OpenAI API gateway, so I did some basic testing. Nothing fine-tuned; I just used simple working instances and collected this data to compare:

Ping/Tracert to api.openai.com

From Google Cloud Instance:

From AWS Instance:

From an Azure instance, tracert doesn’t work as usual, but using nmap we can see how many hops:
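For anyone who wants to reproduce the comparison: the exact commands behind the screenshots aren’t shown, so this is just a sketch of the kind of commands typically used. The `avg_rtt` helper is my own addition for pulling the average out of the ping summary:

```shell
# avg_rtt: extract the average round-trip time (ms) from `ping -c N` output.
# The Linux iputils summary line looks like:
#   rtt min/avg/max/mdev = 1.2/3.4/5.6/0.7 ms
avg_rtt() {
  awk -F'/' '/^rtt/ {print $5}'
}

# Usage from each cloud instance (requires network access):
#   ping -c 10 api.openai.com | avg_rtt          # average latency in ms
#   traceroute api.openai.com                    # hop-by-hop path
#   sudo nmap -sn --traceroute api.openai.com    # hop count when traceroute is filtered
```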

Ok, we can see that Google is a bit farther away; I think it is a better fit for Dialogflow integration, but that is another topic.
AWS looks great: the current CDN for the OpenAI API is Cloudflare, and they surely have dedicated links and a fine-tuned integration.
Azure looks solid too, with fewer hops to reach the OpenAI API gateway.

So, as you like! The good thing about the project is that you can deploy it wherever you want, locally or in the cloud; try it and tell us how it went.

Greetings!
