Description:
Are you passionate about building intelligent systems and scalable backends that power next-gen digital experiences? We're looking for a Python Backend Developer with a strong foundation in AI/ML, data engineering, and backend frameworks like Django, FastAPI, or Flask—plus hands-on experience working with TensorFlow, LLMs, and cloud-native environments.
If you thrive on solving complex problems and love playing with models, APIs, and automation, then you’re the talent we’re after!
Responsibilities:
- Backend & AI Integration: Build and maintain scalable backend systems using Python, integrating machine learning models (TensorFlow, PyTorch, etc.) and LLM pipelines (e.g., RAG) into production.
- Data Engineering: Handle large-scale data workflows using Pandas, NumPy, PySpark, and Matplotlib for processing, cleaning, and visualizing data.
- Web Scraping & Automation: Develop automation scripts using Selenium and BeautifulSoup to collect and process unstructured data from the web.
- Model Training & Deployment: Train, fine-tune, and deploy ML models using TensorFlow, and optimize them for performance and scalability.
- API Development: Build RESTful and asynchronous APIs using Django REST Framework, FastAPI, or Flask to expose AI-powered services.
- Frontend Collaboration: Collaborate with React and Angular teams to deliver full-stack features and seamless product experiences.
- Security & Auth: Implement secure authentication mechanisms (JWT, OAuth) and ensure best practices in backend security and access control.
- Infrastructure & DevOps: Work with Docker, Linux/Unix, Bash scripting, and cloud platforms (AWS, GCP, Azure) for containerization, automation, and scalable deployment.
- CI/CD & Testing: Write clean, testable code with pytest/unittest and contribute to CI/CD pipelines.
Requirements:
- 1.5+ years of backend development experience using Python.
- Strong experience with AI/ML frameworks (TensorFlow, PyTorch), LLM platforms (e.g., OpenAI, Hugging Face), and RAG pipelines.
- Deep knowledge of SQL/NoSQL databases (PostgreSQL, MySQL, MongoDB, Redis).
- Familiarity with asynchronous processing (Celery, AsyncIO) and microservices architecture.
- Hands-on experience with Selenium, BeautifulSoup, and web automation.
- Comfort with data pipelines and big data tools like PySpark.
- Fluency in Linux environments, shell scripting, and containerized deployments is a plus.
- Team player with strong communication skills and adaptability in agile environments.