How To Set Up Your First AI Automation

AI is evolving at lightning speed. From writing emails to generating art and coding applications, Large Language Models (LLMs) have transformed how we work. But LLMs have limits—they don’t remember past interactions, can’t take independent actions, and won’t automate workflows without help.

Enter AI agents—LLMs with superpowers. Unlike traditional AI, AI agents can reason, interact with tools, and execute tasks autonomously. Instead of just answering questions, they take action, automating research, managing workflows, and even making decisions.

How AI Agents Work

AI agents combine five key elements, sketched in code after the list:

  • Tools 🛠️ – Extend AI capabilities (e.g., searching the web, calling APIs, sending emails).

  • Memory 🧠 – Retain past interactions for more context-aware responses.

  • Environment 🌍 – Interact with apps, databases, or the internet.

  • Actions 🎬 – Perform real tasks beyond static responses.

  • Observations 👀 – Evaluate results and refine decisions in real-time.
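
Conceptually, these pieces come together in a loop: the LLM reasons about the goal, picks a tool, observes the result, and repeats until it can answer. Here is a minimal, framework-free sketch of that loop (the llm_decide helper and the decision format are purely illustrative; frameworks like LangChain handle this plumbing for you):

# Minimal agent loop sketch (illustrative only)
def run_agent(goal, tools, llm_decide, max_steps=5):
    memory = []                                      # Memory: past decisions and observations
    for _ in range(max_steps):
        decision = llm_decide(goal, memory)          # LLM reasons over the goal plus memory
        if decision["action"] == "finish":
            return decision["answer"]                # Final answer reached
        tool = tools[decision["action"]]             # Pick a tool (e.g., web search)
        observation = tool(decision["input"])        # Action: interact with the environment
        memory.append((decision, observation))       # Observation feeds the next step
    return "No answer within step limit"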

Building an AI Agent to Fetch LinkedIn Profiles

Let’s build a simple AI agent that finds LinkedIn profiles based on a name input. We’ll use LangChain, Tavily Search API, and GoogleSearch from Phidata to automate this task.

Step 1: Set Up Your Environment

Install the required Python libraries:

pip install langchain langchain-community langchain-groq python-dotenv phidata googlesearch-python

Step 2: Get API Keys

Obtain keys for:

  • Tavily API – Enables precise LinkedIn searches.

  • Groq API – Provides free LLM access.

Store these in a .env file:

TAVILY_API_KEY=your_key_here
GROQ_API_KEY=your_key_here
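
A quick way to confirm the keys are being picked up is to load the .env file and check the environment (this assumes python-dotenv from Step 1):

import os
from dotenv import load_dotenv

load_dotenv()  # Reads TAVILY_API_KEY and GROQ_API_KEY from .env into the environment
print(bool(os.getenv("TAVILY_API_KEY")), bool(os.getenv("GROQ_API_KEY")))  # Expect: True True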

Step 3: Create Search Functions

from dotenv import load_dotenv
from langchain_community.tools.tavily_search import TavilySearchResults

# Load the Tavily API key from the .env file
load_dotenv()

def search_profile_url_using_tavily(name):
    # Run a Tavily web search for the given query and return the raw results
    search = TavilySearchResults()
    return search.run(name)
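
You can sanity-check this function on its own before wiring it into the agent; the query string below is just an example, and the result is typically a list of search hits with URLs and snippets:

print(search_profile_url_using_tavily("John Doe LinkedIn profile"))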

For Google search, use:

from phi.tools.googlesearch import GoogleSearch
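
To try it outside the agent, you can wrap it in a small helper of your own (the helper name is illustrative; google_search is the same method the agent calls in Step 4):

def search_profile_url_using_google(name):
    # Ask Google for the person's LinkedIn profile; phidata returns the results as a string
    return GoogleSearch().google_search(f"{name} LinkedIn profile")

print(search_profile_url_using_google("John Doe"))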

Step 4: Build the AI Agent

from langchain.prompts import PromptTemplate
from langchain_groq import ChatGroq
from langchain.tools import Tool
from langchain.agents import create_react_agent, AgentExecutor
from langchain import hub

# Load environment variables (TAVILY_API_KEY, GROQ_API_KEY)
load_dotenv()

def get_linkedin_url(name: str) -> str:
    # Groq-hosted Llama 3.3 as the reasoning engine; temperature=0 for deterministic output
    groq_llm = ChatGroq(model="llama-3.3-70b-versatile", temperature=0)

    prompt_template = PromptTemplate(
        input_variables=["name_of_person"],
        template="Given the full name {name_of_person}, fetch the LinkedIn profile URL.",
    )

    # Expose both search backends to the agent as tools
    tools = [
        Tool(name="Tavily Search", func=search_profile_url_using_tavily, description="Search the web for LinkedIn profiles"),
        Tool(name="Google Search", func=GoogleSearch().google_search, description="Perform a Google search"),
    ]

    # Standard ReAct prompt pulled from the LangChain hub
    agent = create_react_agent(llm=groq_llm, tools=tools, prompt=hub.pull("hwchase17/react"))
    agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True, handle_parsing_errors=True)

    result = agent_executor.invoke({"input": prompt_template.format(name_of_person=name)})
    return result["output"]

Step 5: Run the Agent

Call the function with a name to fetch a LinkedIn profile URL:

print(get_linkedin_url("John Doe"))
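
If everything lives in a single script (say agent.py; the file name is just an example), a small __main__ guard lets you pass the name from the command line:

import sys

if __name__ == "__main__":
    # Usage: python agent.py "John Doe"
    name = sys.argv[1] if len(sys.argv) > 1 else "John Doe"
    print(get_linkedin_url(name))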

Conclusion

AI agents are reshaping automation by combining reasoning, action, and decision-making. Unlike standard LLMs, they go beyond responses to execute real-world tasks. This simple project is just the beginning—AI agents can automate research, customer interactions, and more.

Want more AI automation insights? Follow me for the latest updates! 🚀

 
