🚀 Build a Prompt Generator App with Angular, Tailwind CSS & Google Gemini API
Manthan Ankolekar



Publish Date: Jul 17

If you're looking to explore generative AI in your Angular apps, this blog is for you! Learn how I built a modern, full-featured Prompt Generator using Angular, Tailwind CSS, and the Gemini API, with a secure Node.js/Express backend that keeps your API key safe.

🌟 What Is the Prompt Generator?

The Prompt Generator is a web app that allows you to:

✅ Generate creative, formal, funny, or concise AI prompts
✅ Copy or save prompts to favorites
✅ View history of generated prompts
✅ Toggle between light and dark mode
✅ Enjoy a responsive and modern UI with Tailwind CSS
✅ Use a secure backend to protect your API key

Live Demo: prompt-generator-application.vercel.app
GitHub Repo: https://github.com/manthanank/prompt-generator


🏗️ Tech Stack

  • Frontend: Angular 20+, Tailwind CSS 4, ngx-markdown
  • Backend: Node.js + Express
  • AI Engine: Google Gemini (gemini-2.5-flash via the @google/genai SDK)

🔒 Why Use a Backend?

Google's Gemini API requires an API key, and shipping that key in frontend code exposes it to anyone who opens the browser's dev tools. That's why this app routes all AI requests through a secure Express server hosted on Vercel.
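To make the boundary concrete, here's a sketch of what the frontend call looks like when it targets the proxy instead of Gemini directly. The function name and relative URL are illustrative assumptions, not code from the repo:

```typescript
// The browser only ever sees the backend route; the Gemini key stays
// server-side. `requestPrompt` is a hypothetical helper for illustration.
async function requestPrompt(userPrompt: string): Promise<string> {
  const res = await fetch('/generate-content', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ userPrompt }),
  });
  if (!res.ok) {
    throw new Error(`Backend error: ${res.status}`);
  }
  const data = await res.json();
  return data.text;
}
```

Because the key never leaves the server, rotating or revoking it requires no frontend redeploy.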


📁 Project Structure

prompt-generator/
├── backend/          → Express API for Gemini
├── src/              → Angular frontend
│   ├── app/          → Standalone Angular components
│   ├── environments/ → Environment config
│   └── styles.css    → Tailwind + Dark mode styles

🔧 Setup & Installation

1. Clone the Repository

git clone https://github.com/manthanank/prompt-generator.git
cd prompt-generator

2. Install Frontend & Backend Dependencies

npm install
cd backend
npm install

3. Configure the Environment

Create a .env file inside backend/:

GEMINI_API_KEY=your_gemini_api_key_here
PORT=3000

4. Start Development Servers

# In backend/
npm start

# In frontend (root)
npm start

🧠 How It Works

🧾 Prompt Generation Flow:

  1. User Input: Choose style + enter description
  2. API Call: Angular sends prompt to Express backend
  3. Gemini API: Generates high-quality content
  4. Display: Angular shows the markdown-formatted result
  5. Extras: History, Favorites, Copy to Clipboard, Dark Mode

✨ Highlight Features

🗂️ Prompt Styles

Choose from:

  • Creative
  • Formal
  • Funny
  • Concise

Each style is prepended to the user’s input to guide the AI.

const styledInput = `Generate a ${style.toLowerCase()} prompt for: ${userInput}`;
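Wrapped in a small helper, the mapping is easy to unit-test. The `PromptStyle` union and function name below are my own, not from the repo:

```typescript
// Hypothetical helper around the template above; it simply lowercases
// the selected style and prepends the instruction to the user's input.
type PromptStyle = 'Creative' | 'Formal' | 'Funny' | 'Concise';

function buildStyledInput(style: PromptStyle, userInput: string): string {
  return `Generate a ${style.toLowerCase()} prompt for: ${userInput}`;
}
```

For example, buildStyledInput('Funny', 'a cat birthday card') produces "Generate a funny prompt for: a cat birthday card".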

📜 Markdown Rendering with ngx-markdown

Gemini returns responses with headings, lists, and bold text. To display this nicely, we use ngx-markdown:

<markdown [data]="generatedPrompt"></markdown>

💡 Favorites & History with localStorage

Save your best prompts or revisit recent ones!

localStorage.setItem('favorites', JSON.stringify(this.favoritePrompts));
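Extracted into helpers that accept any Storage-like object, the save/load round-trip looks like this. This is a sketch; the interface and function names are assumptions, and in the app `localStorage` itself would be passed in:

```typescript
// Minimal Storage-like interface so the helpers also run outside the
// browser (e.g. in tests); the DOM's localStorage satisfies it.
interface StringStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

function saveFavorites(store: StringStore, prompts: string[]): void {
  store.setItem('favorites', JSON.stringify(prompts));
}

function loadFavorites(store: StringStore): string[] {
  const raw = store.getItem('favorites');
  // Fall back to an empty list when nothing has been saved yet
  return raw ? (JSON.parse(raw) as string[]) : [];
}
```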

🌓 Dark Mode with Tailwind + Signals

Using Angular Signals and Tailwind dark classes:

toggleDarkMode() {
  this.darkMode.update(d => !d);
  document.documentElement.classList.toggle('dark', this.darkMode());
}
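The same logic can be exercised outside Angular by swapping the signal for a plain boolean and injecting the classList, a framework-free sketch in which the class and method names are assumptions:

```typescript
// `classList.toggle(cls, force)` adds the class when `force` is true and
// removes it when false, mirroring the component code above.
class ThemeToggler {
  private dark = false;

  constructor(
    private classList: { toggle(cls: string, force: boolean): void }
  ) {}

  toggle(): boolean {
    this.dark = !this.dark;
    this.classList.toggle('dark', this.dark);
    return this.dark;
  }
}
```

Injecting the classList keeps the toggle testable without a DOM, which is the same benefit the signal-based version gets from Angular's DI.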

🔐 Backend: Express + Gemini API

Secure route that proxies requests to Gemini:

app.post('/generate-content', async (req, res) => {
  const { userPrompt } = req.body;
  if (!userPrompt) {
    return res.status(400).json({ error: 'userPrompt is required' });
  }
  try {
    const response = await ai.models.generateContent({
      model: 'gemini-2.5-flash',
      contents: userPrompt,
    });
    res.json({ text: response.text });
  } catch (error) {
    res.status(500).json({ error: 'Failed to generate content' });
  }
});

Hosted via Vercel with a vercel.json config.
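The exact file isn't reproduced here, but a typical minimal vercel.json for an Express app looks like this, a sketch under the assumption that the backend entry point is index.js:

```json
{
  "version": 2,
  "builds": [{ "src": "index.js", "use": "@vercel/node" }],
  "routes": [{ "src": "/(.*)", "dest": "index.js" }]
}
```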


🌍 Deployment Options

  • ✅ Frontend: GitHub Pages / Vercel / Netlify
  • ✅ Backend: Vercel Serverless Function or Railway
  • ✅ Secrets: kept out of the bundle with .env on the backend and build-time environment config on the frontend

📸 Screenshots

(Add screenshots of UI, light/dark modes, mobile view, etc.)


🧩 Future Enhancements

  • ✍️ Prompt Templates
  • 🔐 Login with Google (Firebase Auth)
  • 📊 Prompt analytics
  • 🗃️ Export to PDF or Markdown

📜 License

MIT © Manthan Ankolekar


🙌 Final Thoughts

Building this project was a great exercise in combining the power of:

  • Angular Standalone Components + Signals
  • Tailwind’s dark mode utility
  • AI text generation via Google Gemini
  • Simple, clean persistence with localStorage

🔗 Check it out: prompt-generator-application.vercel.app
📦 View the Code: github.com/manthanank/prompt-generator

