OpenAI announced a slew of new products and features at Monday's DevDay conference in San Francisco, including new tools to customize chatbots, build applications, and handle speech that are meant to foster a robust developer community around its generative AI technology.

The conference was OpenAI's first big public event for developers, and a chance to show how it plans to serve the software development community.

"Compared to every other big tech event I’ve been to, OpenAI Dev Day is the highest 'OK, I have to go build something with this new release immediately' score," wrote AI advisor and investor Allie K. Miller in a post on X (formerly Twitter). "I’m talking 11/10 builder activation score. It’s incredible."

Here are three OpenAI products or platforms that might pique your interest for a generative AI project — plus one new open-source alternative being offered in response.

Probably the most eye-catching addition to OpenAI's lineup is a new, no-code way to create customized chatbots designed for specific tasks. These bots, called GPTs, will include the option to answer questions using specific data supplied by their authors. That should open the door to much quicker creation of domain-specific ChatGPT-powered bots for things like customer service, documentation Q&A, or connecting to a product database.

This capability should be available for Plus and Enterprise users sometime this week at https://chat.openai.com/gpts/editor.

While people have been building chatbots powered by OpenAI for a while, this promises to be a quicker and easier way to do so. GPTs also get OpenAI's web platform to publish on, the ChatGPT user experience, and a large language model (LLM) that provides the generative AI.

As someone who's spent hours trying to optimize data processing so an LLM can best answer questions about a set of uploaded documents, followed by trying to decide which front end to use and where to host it, I'm interested to see how well this works.

ChatGPT Enterprise customers will be able to create bots for internal use only. Individuals can keep their bots private or make them available to anyone with a link, and developers will also have the option to publish their GPTs on a soon-to-be-launched OpenAI GPT Store. Authors of the most popular bots may get some revenue sharing, and I expect a lot of developers will want to try their hand at creating a hit.

I've been somewhat underwhelmed with the OpenAI plug-in experience of wading through available options to find one that's best for my task, and I'm not the only one who's a bit skeptical about how the new GPT Store will ultimately work. OpenAI CEO Sam Altman pledged that the company will make sure GPTs published in the store follow acceptable-use policies. However, if other marketplaces like Google Play are any guide, that's more difficult than it sounds. Still, given OpenAI's current popularity, there should be plenty of people who give the GPT Store a look when it launches.

The new Assistants API offers GPTs-style functionality to developers who would like to build their own apps instead of hosting a bot at OpenAI. The API offers easier ways to handle things like threads and lengthy conversations than coding an app against a basic LLM API. In addition, this API can call multiple functions at the same time, with more likelihood than before that what the LLM returns is properly formed JSON that can be used in later steps, according to the company.
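To make that concrete, here is a minimal sketch of an Assistants API round trip, assuming the `openai` Python SDK v1.x and an `OPENAI_API_KEY` environment variable. The assistant name, instructions, and question are invented for illustration, and `latest_assistant_text` is our own helper, not part of the SDK:

```python
# Minimal Assistants API sketch: create an assistant, start a thread,
# run it, and read the reply. Assumes the `openai` Python SDK v1.x.
import os
import time


def latest_assistant_text(messages):
    """Return the text of the newest assistant message from a list of
    message dicts shaped like Assistants API message objects (newest
    first). This helper is ours, not part of the SDK."""
    for msg in messages:
        if msg["role"] == "assistant":
            return "".join(
                part["text"]["value"]
                for part in msg["content"]
                if part["type"] == "text"
            )
    return None


def main():
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Assistants persist server-side; threads hold conversation state,
    # so there is no need to resend the whole history on each request.
    assistant = client.beta.assistants.create(
        name="Trip splitter",  # name and instructions are made up
        instructions="Answer questions, writing code when math is needed.",
        tools=[{"type": "code_interpreter"}],
        model="gpt-4-1106-preview",
    )
    thread = client.beta.threads.create()
    client.beta.threads.messages.create(
        thread_id=thread.id,
        role="user",
        content="What's one quarter of $1,200?",
    )
    run = client.beta.threads.runs.create(
        thread_id=thread.id, assistant_id=assistant.id
    )
    # Runs are asynchronous: poll until the run leaves the active states.
    while run.status in ("queued", "in_progress"):
        time.sleep(1)
        run = client.beta.threads.runs.retrieve(
            thread_id=thread.id, run_id=run.id
        )
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    print(latest_assistant_text([m.model_dump() for m in messages.data]))


if __name__ == "__main__" and os.environ.get("OPENAI_API_KEY"):
    main()
```

Compare this to the raw chat completions API, where your code would have to track and resend the full message history on every call.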

For people working on chatbots that answer questions about specific information such as software documentation, tasks like splitting texts into chunks and generating embeddings for semantic search are taken care of in the background.
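For a feel of what that background chunking step involves, here is a rough word-based chunker with overlap. This is not OpenAI's implementation, just a common retrieval pattern that the API now handles for you:

```python
# Illustrative only: the kind of text splitting a retrieval pipeline does
# before generating embeddings. Not OpenAI's actual implementation.
def chunk_words(text, chunk_size=100, overlap=20):
    """Split text into chunks of up to chunk_size words; consecutive
    chunks share `overlap` words so a passage that straddles a chunk
    boundary stays retrievable."""
    words = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
        if start + chunk_size >= len(words):
            break
    return chunks
```

Each chunk would then be embedded and stored in a vector index, so the model can pull in only the passages most relevant to a question.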

A demo app powered by the OpenAI Assistants API calls Code Interpreter to write and execute Python code behind the scenes.

The Assistants API also has access to Code Interpreter for running sandboxed Python code. Once enabled, Code Interpreter kicks in if the LLM decides that a user's question requires some calculations. As an example, the OpenAI DevDay keynote featured a travel app powered by the Assistants API. When a user uploaded their flight and Airbnb bills and asked "We'll be 4 friends staying at this Airbnb. What's my share of it + my flight?", the LLM called Code Interpreter to generate Python code and then answered the question.
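The keynote question reduces to a few lines of arithmetic, roughly what Code Interpreter would generate behind the scenes. The dollar amounts below are invented for illustration; in the demo, the figures came from the user's uploaded bills:

```python
# Roughly the calculation Code Interpreter would generate for the
# keynote question. Amounts are hypothetical.
airbnb_total = 1200.00  # hypothetical Airbnb bill, split four ways
flight_price = 350.00   # hypothetical flight, paid individually
friends = 4

my_share = airbnb_total / friends + flight_price
print(f"Your share: ${my_share:,.2f}")  # prints: Your share: $650.00
```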

"OpenAI Assistant is now available in the [OpenAI] playground," Miller tweeted. "This is the most insane way to use natural language to program your own bot. Literally upload an entire textbook and build your own tutor. Upload technical docs and teach yourself a new program." The playground dashboard lets programmers see the steps the AI is taking to respond to queries.

OpenAI also announced a text-to-speech API, an endpoint for its new TTS model. It includes six different voices, and initial comments on the voice quality have been favorable.

The default response format is MP3, but other formats such as Opus, AAC, and FLAC are available, and the endpoint supports real-time audio streaming.
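A minimal request might look like the following sketch, again assuming the `openai` Python SDK v1.x and an `OPENAI_API_KEY` environment variable; the input text and output filename are arbitrary:

```python
# Minimal text-to-speech sketch using the `openai` Python SDK v1.x.
import os

# The six built-in voices the endpoint shipped with.
VOICES = ("alloy", "echo", "fable", "onyx", "nova", "shimmer")


def main():
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.audio.speech.create(
        model="tts-1",   # "tts-1-hd" trades latency for higher quality
        voice=VOICES[0],
        input="Hello from the new text-to-speech endpoint.",
    )
    response.stream_to_file("speech.mp3")  # MP3 is the default format


if __name__ == "__main__" and os.environ.get("OPENAI_API_KEY"):
    main()
```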

"My usual approach to understanding new APIs is to build something with them, so I've built a new tool," open-source developer Simon Willison posted on Mastodon: "ospeak: a CLI tool for speaking text in the terminal via OpenAI."

You can see details about the API at https://platform.openai.com/docs/guides/text-to-speech, and Willison's tool at https://simonwillison.net/2023/Nov/7/ospeak/.

As an alternative to OpenAI tools, LangChain, which provides a framework for the development of apps built with LLMs, launched OpenGPTs. The chatbot development tool "is an open source effort to create a similar experience as OpenAI's GPTs," according to LangChain's GitHub repository. "This gives you more control over the LLM you use (choose between the 60+ that LangChain offers), the prompts you use (use LangSmith to debug those), and the tools you give it (choose from LangChain's 100+ tools, or easily write your own)... As much as possible, we are striving for feature parity with OpenAI."

The simple, sample hosted version isn't quite as slick as OpenAI's version, although it was probably put together in less than a day. As it evolves, it may appeal to developers who don't want to be locked into the OpenAI ecosystem. LangChain is working on a hosted version for those who want the flexibility of tool choices but aren't interested in managing their own cloud host.

For now, you can run a local version with a Python installation for the back end. It uses React, TypeScript and Vite for the front end. There's more info in the repo's README file.
