LangChain in Action: Translating, Summarizing, and Analyzing Text Across Languages

Anna C S Medeiros

LangChain is revolutionizing how developers utilize large language models (LLMs) by enabling complex, multi-step text processing tasks. This powerful framework allows for easy chaining of tasks like translation, summarization, and sentiment analysis.

Here we will explore a practical application of LangChain: translating English text into Brazilian Portuguese, then summarizing the translation and analyzing the sentiment of the summary.

If you have never used LangChain or OpenAI, start here.

1. Setup and Dependencies

Set up your environment with the following:

python=3.10
langchain=0.2.1
langchain-openai=0.1.8

To use LangChain for such tasks, you first need to import the necessary modules and define your prompts. The PromptTemplate class from langchain_core.prompts defines how input variables are inserted into a prompt string, while OpenAI from langchain_openai provides the integration with OpenAI’s models.

from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI

2. Defining Prompts

The prompts guide the LLM on what task to perform.

For our scenario, we need three prompts:

  • Translation from English to Brazilian Portuguese.
  • Summarization of the translated Portuguese text.
  • Sentiment analysis of the summary.

translate_template = "Translate the following English text to Brazilian Portuguese: {text}"
summary_template = "Summarize the following Portuguese text: {text}"
sentiment_template = "Analyze the sentiment of the following Portuguese text: {text}"

translate_prompt = PromptTemplate.from_template(translate_template)
summary_prompt = PromptTemplate.from_template(summary_template)
sentiment_prompt = PromptTemplate.from_template(sentiment_template)
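
Before wiring these prompts into a chain, it can help to render one with a sample input to see exactly what string the model will receive. This is just a quick sanity check, not part of the final pipeline:

# Render the translation prompt with a sample input
print(translate_prompt.format(text="I miss you."))
# -> Translate the following English text to Brazilian Portuguese: I miss you.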

3. Initializing the Language Model

Next, we initialize the OpenAI model, setting the temperature parameter to 1 to encourage more varied, creative output. This is useful in tasks like translation and creative writing, where a bit of inventiveness is beneficial.

llm = OpenAI(temperature=1)

# We will also print the default model name being used,
# just out of curiosity:
print(f'Default model is: {llm.model_name}')
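
The print statement above shows which completion model the wrapper falls back to. If you would rather pin a model explicitly instead of relying on the default, you can pass it by name (the model below is only an example):

llm = OpenAI(model="gpt-3.5-turbo-instruct", temperature=1)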

4. Creating and Running the Chain

With the prompts defined and the model initialized, we create a chain of operations: it translates the input text, summarizes the translation, and finally analyzes the sentiment of that summary. The | operator from LangChain Expression Language (LCEL) pipes each step’s output into the next; since every prompt has a single {text} variable, the string returned by each model call slots straight into the following prompt.

chain = (
    translate_prompt | llm
    | summary_prompt | llm
    | sentiment_prompt | llm
)
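
If you want to inspect the intermediate outputs instead of only the final sentiment, you can run the same stages one at a time. This is a small debugging sketch, equivalent to the piped chain above:

# Run each stage separately to see the intermediate results
translated = (translate_prompt | llm).invoke({"text": "I miss you."})
summary = (summary_prompt | llm).invoke({"text": translated})
sentiment = (sentiment_prompt | llm).invoke({"text": summary})

print(translated)
print(summary)
print(sentiment)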

5. Invoking the Chain

To see the chain in action, we simply invoke it with an input text:

result = chain.invoke({"text": "I miss you, but it's clear that you are having the time of your life."})

print(result)

""" my output:
A sentiment analysis of this Portuguese text reveals a mixed sentiment.
The author expresses saudades (longing or missing someone) for the person
mentioned, indicating a feeling of sadness or nostalgia. However, the
author also recognizes that the person is enjoying a happy moment in their
life, suggesting a sense of happiness or contentment for the person.
Overall, the sentiment can be described as bittersweet.
"""

Your output will probably be different, as generative models are non-deterministic: with a temperature above 0 the model samples from a probability distribution over tokens, so each run can yield a different variation of the output rather than the same text every time.
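
If you need more repeatable runs, for example while testing, you can lower the temperature when creating the model (a small tweak to the setup above, not part of the original chain):

llm = OpenAI(temperature=0)  # less randomness, more repeatable output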

TL;DR: here is the full code:

import os

os.environ['OPENAI_API_KEY'] = 'my-secret-api-key'  # replace with your own key


from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI

translate_template = "Translate the following English text to Brazilian Portuguese: {text}"
summary_template = "Summarize the following Portuguese text: {text}"
sentiment_template = "Analyze the sentiment of the following Portuguese text: {text}"

translate_prompt = PromptTemplate.from_template(translate_template)
summary_prompt = PromptTemplate.from_template(summary_template)
sentiment_prompt = PromptTemplate.from_template(sentiment_template)

llm = OpenAI(temperature=1)

print(f'Default model is: {llm.model_name}')

chain = (
    translate_prompt | llm
    | summary_prompt | llm
    | sentiment_prompt | llm
)

result = chain.invoke({"text": "I miss you, but it's clear that you are having the time of your life."})
print(result)

Thanks for reading!

Give it some claps so others can find it, too! Also, make sure you follow me on Medium so you don’t miss anything. Let’s connect on LinkedIn.
