Available Models from OpenAI, Huggingface, TogetherAI, and more

The Swarms Framework: A Comprehensive Guide to Model APIs and Usage

Introduction

The Swarms framework is a versatile and robust tool designed to streamline the integration and orchestration of multiple AI models, making it easier for developers to build sophisticated multi-agent systems. This blog provides a detailed guide to using the framework, covering the models it supports, common methods, settings, and practical examples.

Overview of the Swarms Framework

Swarms is a "framework of frameworks" that allows seamless integration of various AI models, including those from OpenAI, Anthropic, Hugging Face, Azure, and more. This flexibility enables users to leverage the strengths of different models within a single application. The framework provides a unified interface for model interaction, simplifying the process of integrating and managing multiple AI models.

Getting Started with Swarms

To get started with Swarms, you need to install the framework and set up the necessary environment variables. Here's a step-by-step guide:

Installation

You can install the Swarms framework using pip:

pip install swarms

Setting Up Environment Variables

Swarms relies on environment variables to manage API keys and other configuration. You can use the python-dotenv package to load these variables from a .env file.

pip install python-dotenv

Create a .env file in your project directory and add your API keys and other settings:

OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
AZURE_OPENAI_ENDPOINT=your_azure_openai_endpoint
AZURE_OPENAI_DEPLOYMENT=your_azure_openai_deployment
OPENAI_API_VERSION=your_openai_api_version
AZURE_OPENAI_API_KEY=your_azure_openai_api_key
AZURE_OPENAI_AD_TOKEN=your_azure_openai_ad_token
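
With the keys in place, you can load them into the process environment at startup using python-dotenv. A minimal sketch:

import os
from dotenv import load_dotenv

# Load variables from the .env file into the process environment
load_dotenv()

# Keys are now available through os.getenv
openai_api_key = os.getenv("OPENAI_API_KEY")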

Using the Swarms Framework

Swarms supports a variety of models from different providers. Here are some examples of how to use these models within the Swarms framework.

Using the Anthropic Model

The Anthropic model is one of the many models supported by Swarms. Here's how you can use it:

import os
from dotenv import load_dotenv
from swarm_models import Anthropic

# Load the environment variables from the .env file
load_dotenv()
anthropic_api_key = os.getenv("ANTHROPIC_API_KEY")

# Create an instance of the Anthropic model
model = Anthropic(anthropic_api_key=anthropic_api_key)

# Define the task
task = "What is quantum field theory? What are 3 books on the field?"

# Generate a response
response = model(task)

# Print the response
print(response)

Using the HuggingfaceLLM Model

HuggingfaceLLM allows you to use models from Hugging Face's vast repository. Here's an example:

from swarm_models import HuggingfaceLLM

# Define the model ID
model_id = "NousResearch/Yarn-Mistral-7b-128k"

# Create an instance of the HuggingfaceLLM model
inference = HuggingfaceLLM(model_id=model_id)

# Define the task
task = "Once upon a time"

# Generate a response
generated_text = inference(task)
print(generated_text)

Using the OpenAIChat Model

The OpenAIChat model is designed for conversational tasks. Here's how to use it:

import os
from dotenv import load_dotenv
from swarm_models import OpenAIChat

# Load the environment variables from the .env file
load_dotenv()
openai_api_key = os.getenv("OPENAI_API_KEY")

# Create an instance of the OpenAIChat model
openai = OpenAIChat(openai_api_key=openai_api_key, verbose=False)

# Run the model on a task
chat = openai("What are quantum fields?")
print(chat)

Using the TogetherLLM Model

TogetherLLM supports models from the Together ecosystem. Here's an example:

from swarms import TogetherLLM

# Initialize the model with your parameters
model = TogetherLLM(
    model_name="mistralai/Mixtral-8x7B-Instruct-v0.1",
    max_tokens=1000,
    together_api_key="your_together_api_key",
)

# Run the model
response = model.run("Generate a blog post about the best way to make money online.")
print(response)

Using the Azure OpenAI Model

The Azure OpenAI model is another powerful tool that can be integrated with Swarms. Here's how to use it:

import os
from dotenv import load_dotenv
from swarms import AzureOpenAI

# Load the environment variables
load_dotenv()

# Create an instance of the AzureOpenAI class
model = AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    deployment_name=os.getenv("AZURE_OPENAI_DEPLOYMENT"),
    openai_api_version=os.getenv("OPENAI_API_VERSION"),
    openai_api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    azure_ad_token=os.getenv("AZURE_OPENAI_AD_TOKEN"),
)

# Define the prompt
prompt = (
    "Analyze this load document and assess it for any risks and"
    " create a table in markdown format."
)

# Generate a response
response = model(prompt)
print(response)

Using the GPT4VisionAPI Model

The GPT4VisionAPI model can analyze images and provide detailed insights. Here's how to use it:

import os
from dotenv import load_dotenv
from swarms import GPT4VisionAPI

# Load the environment variables
load_dotenv()

# Get the API key from the environment variables
api_key = os.getenv("OPENAI_API_KEY")

# Create an instance of the GPT4VisionAPI class
gpt4vision = GPT4VisionAPI(
    openai_api_key=api_key,
    model_name="gpt-4o",
    max_tokens=1000,
    openai_proxy="https://api.openai.com/v1/chat/completions",
)

# Define the path of the image to analyze
img = "ear.png"

# Define the task to perform on the image
task = "What is this image?"

# Run the GPT4VisionAPI on the image with the specified task
answer = gpt4vision.run(task, img, return_json=True)

# Print the answer
print(answer)

Using the QwenVLMultiModal Model

The QwenVLMultiModal model is designed for multi-modal tasks, such as processing both text and images. Here's an example of how to use it:

from swarms import QwenVLMultiModal

# Instantiate the QwenVLMultiModal model
model = QwenVLMultiModal(
    model_name="Qwen/Qwen-VL-Chat",
    device="cuda",
    quantize=True,
)

# Run the model on a text prompt and an image URL
response = model("Hello, how are you?", "https://example.com/image.jpg")

# Print the response
print(response)

Common Methods in Swarms

Swarms provides several common methods that are useful across different models. One of the most frequently used methods is __call__.

The __call__ Method

The __call__ method lets you run a model on a given task simply by calling the model instance like a function. Here is a generic example:

# Assuming `model` is an instance of any supported model
task = "Explain the theory of relativity."
response = model(task)
print(response)

This method abstracts the complexity of interacting with different model APIs, providing a consistent interface for executing tasks.
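
For example, because every supported model exposes the same callable interface, the same task can be dispatched to different providers without changing the calling code. A minimal sketch, assuming the OpenAIChat and Anthropic instances created in the earlier examples:

# Run one task against several model instances through the shared interface
models = [openai, model]  # OpenAIChat and Anthropic instances from above

task = "Summarize the theory of relativity in two sentences."

for m in models:
    print(type(m).__name__, ":", m(task))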

Common Settings in Swarms

Swarms allows you to configure various settings to customize the behavior of the models. Here are some common settings:

API Keys

API keys are essential for authenticating and accessing the models. These keys are typically set through environment variables:

import os

# Set API keys as environment variables
os.environ['OPENAI_API_KEY'] = 'your_openai_api_key'
os.environ['ANTHROPIC_API_KEY'] = 'your_anthropic_api_key'

Model-Specific Settings

Different models may have specific settings that need to be configured. For example, the AzureOpenAI model requires several settings related to the Azure environment:

model = AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    deployment_name=os.getenv("AZURE_OPENAI_DEPLOYMENT"),
    openai_api_version=os.getenv("OPENAI_API_VERSION"),
    openai_api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    azure_ad_token=os.getenv("AZURE_OPENAI_AD_TOKEN"),
)

Advanced Usage and Best Practices

To make the most out of the Swarms framework, consider the following best practices:

Extensive Logging

Use logging to monitor the behavior and performance of your models. The loguru library is recommended for its simplicity and flexibility:

from loguru import logger

logger.add("file.log", rotation="10 MB")

# Log model interactions
logger.info("Running task on Anthropic model")
response = model(task)
logger.info(f"Response: {response}")

Error Handling

Implement robust error handling to manage API failures and other issues gracefully:

try:
    response = model(task)
except Exception as e:
    logger.error(f"Error running task: {e}")
    response = "An error occurred while processing your request."
print(response)
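
For transient API failures, you may also want to retry with exponential backoff before giving up. This is a generic sketch built on the standard library, not a built-in Swarms feature:

import time

def run_with_retries(model, task, retries=3, delay=2.0):
    """Retry a model call, doubling the delay after each failure."""
    for attempt in range(1, retries + 1):
        try:
            return model(task)
        except Exception as e:
            logger.warning(f"Attempt {attempt} failed: {e}")
            if attempt == retries:
                raise
            time.sleep(delay)
            delay *= 2

response = run_with_retries(model, task)
print(response)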

Conclusion

The Swarms framework provides a powerful and flexible way to integrate and manage multiple AI models within a single application. By following the guidelines and examples in this blog, you can use Swarms to build sophisticated multi-agent systems with ease. Whether you're using models from OpenAI, Anthropic, Azure, or Hugging Face, Swarms offers a unified interface that simplifies model orchestration and execution.