What Are AI Assistants?

Jan 8, 2024

Introduction to Large Language Models and Creative Assistance

Large language models are making our creative lives easier. Most of us have already used them in some form or another. Let’s take them to the next level!

Chat-Like Interfaces for Large Language Models (LLMs)

The easiest and most straightforward way to interact with LLMs is through a prompting (i.e. chat) interface such as ChatGPT, integrated manually into your existing workflows and processes. “Chatting” with an LLM is genuinely useful: you describe your task in a prompt and ask for the result you want. You then wait, inspect the output and, if it’s satisfactory, use it in your work. If it’s not quite what you wanted, you refine the prompt or give further instructions. This works well for simple tasks such as extracting information from large pieces of text, rewriting text, ideating and a myriad of other one-off tasks.
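
For concreteness, here is roughly what that manual workflow looks like in code, a minimal sketch using OpenAI’s Python SDK; the model name and the prompt are placeholder assumptions:

```python
# A minimal sketch of the prompt-and-inspect workflow.
# Assumes OPENAI_API_KEY is set in the environment; model name is an assumption.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4",  # any chat-capable model works
    messages=[
        {"role": "user", "content": "Rewrite this paragraph in a friendlier tone: ..."},
    ],
)

# Inspect the result, then refine the prompt by hand if it is not quite right.
print(response.choices[0].message.content)
```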

Limitations of the Chat-Like Interface

Things get more complicated if you want to do more. Let’s say you’re writing a blog article that you want to rank high on search engines. Considering a few SEO best practices, you would

  • Structure the text into relevant sections,

  • Include meaningful headings,

  • Research relevant keywords for your topic,

  • Integrate keywords seamlessly into the text,

  • Research relevant outbound links to authoritative sources,

  • … and the list goes on.

It is very hard to do all of this in one prompt. What you’d have to do instead is start with your “raw data” and iteratively refine the article. You’d first ask the LLM to structure the text into sections and include headings. You would then provide it with keywords manually and ask it to weave them in seamlessly. And so on for every other point.
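
To illustrate how tedious this manual loop gets, here is a rough sketch of the back-and-forth in code; the model name, the prompts and the list of steps are placeholder assumptions:

```python
from openai import OpenAI

client = OpenAI()

# Each refinement step the author has to drive by hand (placeholders only).
steps = [
    "Structure the following draft into sections with meaningful headings: ...",
    "Integrate these keywords seamlessly into the text: ...",  # keywords researched manually
    "Suggest where relevant outbound links could be added.",
]

messages = []
for step in steps:
    messages.append({"role": "user", "content": step})
    reply = client.chat.completions.create(model="gpt-4", messages=messages)
    answer = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    # You still have to read each intermediate answer and adjust the next step yourself.
```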

Isn’t there a way to let the LLM do this for you automatically?

How Do AI Assistants Utilize LLMs?

If you’re looking for a way for the LLM to continue on its own until it reaches the desired result, this is precisely what AI assistants do! They guide the LLM toward your desired output. In other words, they automate the LLM’s steps so that you end up with the final result and don’t have to do anything other than specify a “goal”. The assistant supervises the LLM for you.

To go just a tiny bit deeper into the technical aspects, the LLM and the assistant can be viewed as two different logical components, even though they work together in tandem. It’s easier to grasp the concept when viewing them as different logical entities.

We know that the LLM responds to prompts. An assistant will re-feed the LLM with its own output together with relevant further information at each step, as required by the LLM. How does the assistant know what information the LLM requires? Well, the LLM will tell it what it needs.
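
Conceptually, the assistant is just a loop around the LLM. The sketch below fakes both the LLM and the tool with stand-in functions so it runs on its own; none of these names come from a real framework:

```python
# A minimal sketch of the assistant loop. The "LLM" is faked so the example runs
# standalone; in practice call_llm would hit a real model API.
def call_llm(messages):
    # Fake LLM: first it asks for a web search, then it produces a final answer.
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_request": "search: relevant outbound links for the article topic"}
    return {"content": "Final article draft, with the fetched links woven in."}

def run_tool(request):
    # Stand-in for a real tool, e.g. a web search that returns relevant links.
    return "https://example.com/seo-guide"

def run_assistant(goal):
    messages = [{"role": "user", "content": goal}]
    while True:
        output = call_llm(messages)               # the LLM decides the next step
        if "tool_request" in output:              # the LLM states what it still needs
            result = run_tool(output["tool_request"])
            messages.append({"role": "tool", "content": result})  # re-feed the LLM
        else:
            return output["content"]              # nothing requested: final result

print(run_assistant("Write an SEO-friendly blog article about AI assistants."))
```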

How Do AI Assistants Integrate with Existing Technology?

You can make an LLM request external information when necessary by prompting it in a certain way. It will then ask for this information, just like a human assistant would. In our blog post example, the LLM will ask for external links to include in the article. It will mention this in its output. The assistant reads the output, goes online, searches for relevant links and returns them in a new prompt to the LLM. This back-and-forth between the assistant and the LLM happens at every step. The assistant will use various “tools” to fetch external information. Once all steps are done the LLM stops requesting further information and will output the final result.

It’s important to note that the LLM steers the assistant. It constitutes the “brain” of this interplay. The assistant only executes and relays data back to the LLM. It’s also in charge of re-running the LLM on the new data. In that sense, the assistant gives “life” to the LLM, so to speak: without it, we’d have to re-run the LLM ourselves, or it would not run at all.

In essence, instead of you having to inspect the output of the intermediate steps for our blog post example, you let the assistant take care of those intermediate steps. Most importantly, you don’t need to fetch the external information yourself, such as the external links for the blog article. Note that there is automation happening on two levels: LLM supervision (i.e. running the LLM) and data gathering from external sources.

How Can AI Assistants Improve Productivity?

Let’s have a look at the “tools” concept. Tools are the external programs that the assistant can access on behalf of the LLM. In our example, such a tool can be a web browser that navigates to a search engine and returns a list of relevant links. Other tools can fetch from databases, emails, messages or any other knowledge source relevant for the task at hand. Or they can schedule events in a calendar, book flights, buy groceries and actually “do” things in the real world. You can connect any API to an assistant, so the possibilities are limited only by your imagination. AI assistants reduce manual work significantly!
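
To make the “tools” idea concrete, here is a sketch of how an assistant might map tool names to ordinary functions; the tools and their names are illustrative placeholders, not any specific framework’s API:

```python
import datetime

def web_search(query: str) -> str:
    # Placeholder: in practice this would call a search engine API or drive a browser.
    return f"Top links for '{query}': https://example.com/a, https://example.com/b"

def schedule_event(title: str, when: str) -> str:
    # Placeholder: in practice this would call a calendar API.
    return f"Scheduled '{title}' for {when} (today is {datetime.date.today()})."

# The registry the assistant consults when the LLM asks for a tool by name.
TOOLS = {"web_search": web_search, "schedule_event": schedule_event}

def dispatch(tool_name: str, **kwargs) -> str:
    """Run whichever tool the LLM requested and return its result as text."""
    return TOOLS[tool_name](**kwargs)

print(dispatch("web_search", query="SEO best practices"))
```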

LLM-based AI Assistant Technology

Assistant technology already exists. Some examples include the LangChain framework, Microsoft’s AutoGen and OpenAI’s custom GPTs. These vary in terms of technical complexity for the end user. LangChain and AutoGen are developer-oriented but offer the most flexibility, whereas custom GPTs are the most user-friendly and the best place to start. Feel free to contact us at Next Operations; our development team is happy to assist you with custom AI assistants.
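
For developers curious what this looks like in practice, the sketch below follows LangChain’s older `initialize_agent` quickstart style; exact import paths and class names vary between LangChain versions, so treat it as an assumption-laden illustration rather than a reference:

```python
# Illustrative only: the LangChain API has changed across versions, so these imports
# may need adapting. Assumes OPENAI_API_KEY and SERPAPI_API_KEY are set.
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)                # the "brain" of the assistant
tools = load_tools(["serpapi"], llm=llm)   # a web-search tool the assistant can call

agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
)

agent.run("Find three reputable outbound links about SEO best practices.")
```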

OpenAI Custom GPTs

We’ll focus on custom GPTs in this article. They require no programming knowledge and the whole assistant logic is already provided by OpenAI and ready to use. They’re not fully automatic (you still have to supervise the assistant and re-run it by hand) but they offer a great deal in terms of external tool support, which is already a huge step forward from just LLMs alone.

Plug-Ins: The Main Connection to the Outside World

The tools they use are called “plug-ins” and were released by OpenAI shortly before custom GPTs. Some examples are the Data Analysis plug-in or the web browser plug-in. The former writes code and runs it on provided data sets and the latter browses the web for up-to-date information to serve the user’s request. There are also a myriad of community-written plug-ins, such as plug-ins connected to flight or hotel booking platforms.

These assistants also make use of “function calling”, which allows them to be connected to any API, including yours! This way they are enriched with whatever data is relevant to your business and can solve complex problems.
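
As a sketch of how function calling works: you describe your own API to the model and it responds with a structured call your code can execute. The function name, its parameters and the model below are illustrative assumptions, not part of any real system:

```python
import json
from openai import OpenAI

client = OpenAI()

# Describe your own API to the model (name and parameters are illustrative).
tools = [{
    "type": "function",
    "function": {
        "name": "get_order_status",
        "description": "Look up the status of a customer order in an internal system.",
        "parameters": {
            "type": "object",
            "properties": {"order_id": {"type": "string"}},
            "required": ["order_id"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4",  # assumed model name
    messages=[{"role": "user", "content": "Where is order 1234?"}],
    tools=tools,
)

message = response.choices[0].message
if message.tool_calls:  # the model asks to call your function with structured arguments
    args = json.loads(message.tool_calls[0].function.arguments)
    print("Model wants to call get_order_status with:", args)
    # Your code now calls the real API and feeds the result back in a follow-up message.
```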

In general, plug-ins solve two basic limitations of LLMs:

  1. LLM knowledge is limited to the time when they were trained (cut-off date). Hence the LLM acts more like an engine than a knowledge base.

    • Solution: plug-ins augment the prompt by providing external context to the LLM

  2. LLMs only understand and produce text; they cannot act on it, i.e. they cannot themselves execute the instructions they output.

    • Solution: plug-ins redirect these instructions to the respective platform, i.e. they call API endpoints and execute generated code on the provided user data

This is a great enhancement over basic LLM capabilities.

Can AI Assistants Replace Human Jobs?

As OpenAI puts it on their blog: “Plugins will likely have wide-ranging societal implications. For example, we recently released a working paper which found that language models with access to tools will likely have much greater economic impacts than those without, and more generally, in line with other researchers’ findings, we expect the current wave of AI technologies to have a big effect on the pace of job transformation, displacement, and creation. We are eager to collaborate with external researchers and our customers to study these impacts.” In this sense, LLMs can be viewed as engines rather than knowledge bases. Their real power lies in solving company-specific problems rather than being one-size-fits-all solutions.

In other words, LLMs with plug-ins as external data sources and action possibilities have a massive productivity impact. We at Next Operations are at the forefront of these advancements and offer proof-of-concept implementations that show you the immediate benefit these models can have on your business. We don’t think that LLMs, or AI in general, will replace humans. But as with any technology, staying up to date and upskilling are important; that is how AI becomes part of the modern workplace.

Conclusion

In the rapidly evolving digital landscape, AI assistants are becoming indispensable tools for businesses and individuals alike. By leveraging the capabilities of Large Language Models, these intelligent systems streamline complex tasks, enhance productivity, and foster innovation. Whether it’s through automating routine tasks or providing sophisticated solutions for unique challenges, AI assistants are reshaping the way we work and interact with technology. As we continue to explore the potential of AI personal assistants for business applications, it’s clear that these advanced tools will play a pivotal role in driving efficiency and success in the modern workplace.

Frequently Asked Questions

What are AI assistants?

AI assistants are advanced software programs that utilize Large Language Models (LLMs) to automate tasks, process information, and assist users in achieving their goals more efficiently.

How Do AI Assistants Utilize LLMs?

AI assistants utilize LLMs by guiding them through a series of prompts and tasks, automating the process of generating outputs that align with the user’s objectives.

Which is the best AI assistant?

The best AI assistant varies depending on the user’s specific needs and the complexity of tasks. OpenAI’s custom GPTs are user-friendly and widely regarded for their versatility and ease of use.

Is there an AI assistant online that I can use for my business?

Yes, there are several online AI assistant platforms that can be tailored to business needs, such as OpenAI’s custom GPTs and other industry-specific solutions.

Can an AI personal assistant for business really improve my company’s productivity?

Absolutely, an AI personal assistant for business can significantly improve productivity by automating routine tasks, optimizing workflows, and providing quick access to information, allowing employees to focus on more strategic activities.

Boost Efficiency!

Interested in building generative AI into your company? Embrace AI for smarter, faster decision-making. Get in touch to see how we can automate your manual processes.

Book Your Free AI Discovery Call

Save up to 20 hours a week by reducing manual work and repetitive tasks. Book a free consulting call by filling out the form below. We'll get back to you by end-of-day!

“Robert showed a great ability to find and go for projects with direct business impact”

“Robert is an absolute self-starter and one of the most entrepreneurially minded engineers I have met. He is great at structuring big workstreams efficiently. I think he will be a great asset to any firm hiring him.”

Benjamin Westrich

Quant Researcher at DRW

© 2024 Next Operations | Your Partner in Integrating Generative AI
