If you're looking to explore Generative AI in your Angular apps, this blog is for you! Learn how I built a modern, full-featured Prompt Generator using Angular, Tailwind CSS, and the Gemini API—with a secure Node.js/Express backend to keep things safe.
🌟 What Is the Prompt Generator?
The Prompt Generator is a web app that allows you to:
✅ Generate creative, formal, funny, or concise AI prompts
✅ Copy or save prompts to favorites
✅ View history of generated prompts
✅ Toggle between light and dark mode
✅ Enjoy a responsive and modern UI with Tailwind CSS
✅ Use a secure backend to protect your API key
Live Demo: prompt-generator-application.vercel.app
GitHub Repo: https://github.com/manthanank/prompt-generator
🏗️ Tech Stack
- Frontend: Angular 20+, Tailwind CSS 4, ngx-markdown
- Backend: Node.js + Express
- AI Engine: Google Gemini (via the @google/genai SDK)
🔒 Why Use a Backend?
Google's Gemini API requires an API key, and any key shipped in frontend code can be extracted by anyone who opens the browser's dev tools or the network tab. That's why this app routes all AI requests through an Express server hosted on Vercel, where the key stays server-side.
📁 Project Structure
```text
prompt-generator/
├── backend/          → Express API for Gemini
├── src/              → Angular frontend
│   ├── app/          → Standalone Angular components
│   ├── environments/ → Environment config
│   └── styles.css    → Tailwind + dark mode styles
```
🔧 Setup & Installation
1. Clone the Repository
```bash
git clone https://github.com/manthanank/prompt-generator.git
cd prompt-generator
```
2. Install Frontend & Backend Dependencies
```bash
npm install
cd backend
npm install
```
3. Configure the Environment

Create a .env file inside backend/:

```env
GEMINI_API_KEY=your_gemini_api_key_here
PORT=3000
```
4. Start Development Servers

```bash
# In backend/
npm start

# In the frontend (project root)
npm start
```
🧠 How It Works
🧾 Prompt Generation Flow:
- User Input: Choose style + enter description
- API Call: Angular sends prompt to Express backend
- Gemini API: Generates high-quality content
- Display: Angular shows the markdown-formatted result
- Extras: History, Favorites, Copy to Clipboard, Dark Mode
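Steps 1–2 of this flow can be sketched as a small, framework-agnostic helper. This is a sketch, not the app's actual code: the real app wraps the call in an Angular service, and buildRequestBody / generatePrompt are illustrative names. The /generate-content path matches the backend route shown later.

```typescript
type PromptStyle = 'Creative' | 'Formal' | 'Funny' | 'Concise';

// Step 1: prepend the chosen style to the user's description to guide the AI.
function buildRequestBody(style: PromptStyle, userInput: string): { userPrompt: string } {
  return { userPrompt: `Generate a ${style.toLowerCase()} prompt for: ${userInput}` };
}

// Step 2: send the styled prompt to the Express backend; the response body
// carries a markdown string that ngx-markdown renders in step 4.
async function generatePrompt(style: PromptStyle, userInput: string): Promise<string> {
  const res = await fetch('/generate-content', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildRequestBody(style, userInput)),
  });
  if (!res.ok) throw new Error(`Backend error: ${res.status}`);
  const { text } = await res.json();
  return text;
}
```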
✨ Highlight Features
🗂️ Prompt Styles
Choose from:
- Creative
- Formal
- Funny
- Concise
Each style is prepended to the user’s input to guide the AI.
```typescript
const styledInput = `Generate a ${style.toLowerCase()} prompt for: ${userInput}`;
```
📜 Markdown Rendering with ngx-markdown
Gemini returns responses with headings, lists, and bold text. To display this nicely, we use ngx-markdown
:
<markdown [data]="generatedPrompt"></markdown>
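For the markdown component to work, ngx-markdown has to be registered at bootstrap. One common setup, assuming a standalone app.config.ts (adjust to the repo's actual config):

```typescript
// app.config.ts — assumed standalone-app setup
import { ApplicationConfig } from '@angular/core';
import { provideHttpClient } from '@angular/common/http';
import { provideMarkdown } from 'ngx-markdown';

export const appConfig: ApplicationConfig = {
  providers: [
    provideHttpClient(), // ngx-markdown uses HttpClient for [src] loading
    provideMarkdown(),   // registers the markdown parser and component
  ],
};
```

Any standalone component that uses the markdown tag also adds MarkdownComponent to its imports array.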
💡 Favorites & History with localStorage
Save your best prompts or revisit recent ones!
```typescript
localStorage.setItem('favorites', JSON.stringify(this.favoritePrompts));
```
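localStorage only stores strings, so favorites are serialized on save and parsed back on load. A minimal sketch of that round trip, with a Storage-like interface so the helpers also run outside the browser; saveFavorites and loadFavorites are illustrative names, not the app's actual code:

```typescript
// Minimal subset of the browser Storage interface.
interface StringStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

function saveFavorites(store: StringStore, favorites: string[]): void {
  store.setItem('favorites', JSON.stringify(favorites));
}

function loadFavorites(store: StringStore): string[] {
  const raw = store.getItem('favorites');
  if (!raw) return [];
  try {
    return JSON.parse(raw) as string[];
  } catch {
    return []; // corrupted entry: fall back to an empty list
  }
}
```

In the app itself, store is simply the global localStorage object.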
🌓 Dark Mode with Tailwind + Signals
Using Angular Signals and Tailwind dark classes:
```typescript
toggleDarkMode() {
  this.darkMode.update(d => !d);
  document.documentElement.classList.toggle('dark', this.darkMode());
}
```
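For that .dark class toggle to have any effect, Tailwind must use class-based dark mode rather than the default prefers-color-scheme media query. In Tailwind CSS 4 this is configured in CSS; a sketch of styles.css, assuming the v4 @custom-variant syntax:

```css
/* styles.css */
@import "tailwindcss";

/* Make dark: variants follow a .dark class on <html> instead of the OS setting. */
@custom-variant dark (&:where(.dark, .dark *));
```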
🔐 Backend: Express + Gemini API
Secure route that proxies requests to Gemini:
```javascript
// `ai` is a GoogleGenAI client initialized earlier with process.env.GEMINI_API_KEY.
app.post('/generate-content', async (req, res) => {
  try {
    const { userPrompt } = req.body;
    const response = await ai.models.generateContent({
      model: 'gemini-2.5-flash',
      contents: userPrompt,
    });
    res.json({ text: response.text });
  } catch (error) {
    res.status(500).json({ error: 'Failed to generate content' });
  }
});
```
Hosted on Vercel with a vercel.json config.
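A minimal vercel.json for an Express entry point might look like the following; the index.js filename is an assumption about this repo, not a confirmed detail:

```json
{
  "version": 2,
  "builds": [{ "src": "index.js", "use": "@vercel/node" }],
  "routes": [{ "src": "/(.*)", "dest": "index.js" }]
}
```

This routes every request to the Express app, which Vercel runs as a serverless function.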
🌍 Deployment Options
- ✅ Frontend: GitHub Pages / Vercel / Netlify
- ✅ Backend: Vercel Serverless Function or Railway
- ✅ Environment: secrets kept server-side in .env, with build-time environment config for the frontend
📸 Screenshots
(Add screenshots of UI, light/dark modes, mobile view, etc.)
🧩 Future Enhancements
- ✍️ Prompt Templates
- 🔐 Login with Google (Firebase Auth)
- 📊 Prompt analytics
- 🗃️ Export to PDF or Markdown
📜 License
MIT © Manthan Ankolekar
🙌 Final Thoughts
Building this project was a great exercise in combining the power of:
- Angular Standalone Components + Signals
- Tailwind’s dark mode utility
- AI text generation via Google Gemini
- Clean localStorage-backed history and favorites
🔗 Check it out: prompt-generator-application.vercel.app
📦 View the Code: github.com/manthanank/prompt-generator