A comprehensive, production-ready client for the CARLA autonomous driving simulator featuring realistic vehicle control, sensor fusion, real-time monitoring, and automated deployment pipelines. The project is published on PyPI and Docker Hub with full CI/CD automation.

🚗 What It Is

This is my most ambitious project to date: a comprehensive client for the CARLA autonomous driving simulator. The project provides a complete solution for running autonomous driving simulations with realistic vehicle control, sensor fusion, and real-time monitoring. It's designed as a production-ready system with automated deployment, versioning, and distribution.

πŸ› οΈ Technologies Used

Backend & Core

  • Python 3.11 - Core simulation logic and backend services
  • FastAPI - High-performance API for real-time communication
  • PostgreSQL - Data storage for simulation logs and metrics
  • SQLAlchemy - Database ORM and migrations
  • Alembic - Database migration management

Frontend & UI

  • React - Web-based monitoring and control interface
  • Material-UI - Professional UI components
  • WebSocket - Real-time data streaming
  • Chart.js - Real-time data visualization

Infrastructure & DevOps

  • Docker - Containerized deployment for easy setup
  • Docker Compose - Multi-service orchestration
  • GitHub Actions - Automated CI/CD pipeline
  • PyPI - Python package distribution
  • Docker Hub - Container image distribution

Simulation & AI

  • CARLA 0.10.0 - Autonomous driving simulation platform
  • NumPy - Numerical computing and data processing
  • Matplotlib - Data visualization and plotting
  • Pygame - Real-time graphics and input handling

✨ Key Features

Realistic Vehicle Control

  • Physics-based vehicle control systems with accurate dynamics
  • Realistic acceleration, braking, and steering models
  • Support for multiple vehicle types and configurations
  • PID controllers for smooth vehicle operation

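To make the PID control concrete, here is a minimal sketch of a longitudinal speed controller. The gains and the apply_speed_control helper are illustrative assumptions rather than the project's tuned implementation; only carla.VehicleControl and Vehicle.apply_control come from the CARLA Python API.

import carla  # requires the CARLA Python API

class SpeedPID:
    """Minimal PID controller for longitudinal (throttle/brake) control."""

    def __init__(self, kp=0.8, ki=0.05, kd=0.1):  # illustrative gains
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, target_speed, current_speed, dt):
        error = target_speed - current_speed
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def apply_speed_control(vehicle, pid, target_speed, current_speed, dt):
    # Positive controller output becomes throttle, negative output becomes brake.
    u = pid.step(target_speed, current_speed, dt)
    control = carla.VehicleControl()
    control.throttle = max(0.0, min(1.0, u))
    control.brake = max(0.0, min(1.0, -u))
    vehicle.apply_control(control)

In practice the controller is stepped once per simulation tick with the vehicle's measured speed, which keeps throttle and brake changes smooth instead of bang-bang.
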
Advanced Sensor Fusion

  • Integration of multiple sensor types (LIDAR, cameras, GPS, IMU)
  • Real-time sensor data processing and synchronization
  • Advanced perception algorithms for object detection
  • Kalman filtering for sensor fusion

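As a sketch of the Kalman filtering mentioned above, the class below fuses IMU acceleration (prediction step) with noisy GPS positions (update step) under a one-dimensional constant-velocity model; the noise parameters are illustrative placeholders, not the project's tuned values.

import numpy as np

class KalmanFusion1D:
    """Fuse IMU-driven motion prediction with GPS position updates (1D sketch)."""

    def __init__(self, process_var=0.1, gps_var=2.0):  # illustrative noise values
        self.x = np.zeros(2)               # state: [position, velocity]
        self.P = np.eye(2)                 # state covariance
        self.Q = process_var * np.eye(2)   # process noise
        self.R = np.array([[gps_var]])     # GPS measurement noise
        self.H = np.array([[1.0, 0.0]])    # we observe position only

    def predict(self, accel, dt):
        # Constant-velocity model driven by measured acceleration.
        F = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([0.5 * dt**2, dt])
        self.x = F @ self.x + B * accel
        self.P = F @ self.P @ F.T + self.Q

    def update(self, gps_position):
        y = gps_position - self.H @ self.x          # innovation
        S = self.H @ self.P @ self.H.T + self.R     # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P

In a loop, predict() would run with the IMU acceleration every tick, while update() fires only when a (lower-rate) GPS fix arrives.
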
Professional Web Interface

  • Real-time monitoring dashboard built with React
  • Live visualization of vehicle state and sensor data
  • Interactive controls for manual intervention
  • Performance metrics and analytics dashboard
  • Responsive design for multiple screen sizes

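The live dashboard is driven by the FastAPI backend pushing vehicle state over a WebSocket, which the React client renders with Chart.js. Below is a minimal sketch of such an endpoint; the /ws/telemetry path, the payload fields, and the 10 Hz rate are illustrative assumptions, not the project's exact API.

import asyncio
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()

@app.websocket("/ws/telemetry")
async def stream_telemetry(websocket: WebSocket):
    """Push vehicle state to the dashboard at a fixed rate."""
    await websocket.accept()
    try:
        while True:
            # In the real system this payload comes from the simulation loop;
            # the fields here are illustrative.
            await websocket.send_json({
                "speed_kmh": 42.0,
                "throttle": 0.35,
                "steer": -0.02,
                "gps": {"lat": 0.0, "lon": 0.0},
            })
            await asyncio.sleep(0.1)  # ~10 Hz update rate
    except WebSocketDisconnect:
        pass  # client closed the dashboard
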
Production-Ready Deployment

  • Complete Docker containerization with multi-stage builds
  • Automated CI/CD pipeline with GitHub Actions
  • Automated versioning based on semantic versioning
  • Health checks and graceful shutdowns
  • Comprehensive logging and monitoring

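For the health-check and graceful-shutdown bullets above, a typical wiring looks like the FastAPI sketch below; the /health route name and the lifespan hook contents are assumptions about how such checks are commonly implemented, not the project's exact code.

from contextlib import asynccontextmanager
from fastapi import FastAPI

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Startup: connect to the CARLA server, database, etc. (details omitted).
    yield
    # Shutdown: close connections cleanly so the container can stop gracefully,
    # e.g. release the CARLA client and flush pending simulation logs.

app = FastAPI(lifespan=lifespan)

@app.get("/health")
async def health():
    # Polled by Docker/Compose health checks before traffic is routed.
    return {"status": "ok"}
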
🚀 Fully Automated CI/CD Pipeline

The project features a zero-touch deployment system that automates everything from code commit to production:

Zero-Touch Deployment

  • On every code commit to the master branch, the system automatically:
    • Builds the application with comprehensive testing
    • Versions the release based on commit messages (semantic versioning)
    • Publishes to PyPI and Docker Hub simultaneously
    • Creates GitHub releases with changelogs
    • Deploys to production environments

Automated Versioning

The system derives the version bump from the commit message, following the Conventional Commits style:

  • feat: → Minor version bump (1.0.7 → 1.1.0)
  • fix: → Patch version bump (1.0.7 → 1.0.8)
  • BREAKING CHANGE: → Major version bump (1.0.7 → 2.0.0)

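The mapping is simple enough to sketch in a few lines. The actual pipeline relies on its CI tooling to parse commits, so the helper below is purely illustrative of the rules listed above.

def bump_version(version: str, commit_message: str) -> str:
    """Map a conventional-commit message to a semantic version bump (sketch)."""
    major, minor, patch = (int(part) for part in version.split("."))
    header = commit_message.splitlines()[0]
    if "BREAKING CHANGE" in commit_message or header.split(":")[0].endswith("!"):
        return f"{major + 1}.0.0"
    if header.startswith("feat"):
        return f"{major}.{minor + 1}.0"
    if header.startswith("fix"):
        return f"{major}.{minor}.{patch + 1}"
    return version  # e.g. chore: or docs: commits do not trigger a release

assert bump_version("1.0.7", "feat: add LIDAR overlay to dashboard") == "1.1.0"
assert bump_version("1.0.7", "fix: correct steering sign convention") == "1.0.8"
assert bump_version("1.0.7", "refactor!: BREAKING CHANGE: new config format") == "2.0.0"
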
Multi-Platform Publishing

Every release is automatically published to:

  • PyPI: Python package distribution
  • Docker Hub: Container image distribution
  • GitHub Releases: Source code and documentation
  • Read the Docs: API documentation updates

Quality Assurance

The pipeline includes:

  • Automated testing on multiple platforms
  • Code coverage reporting
  • Security scanning for vulnerabilities
  • Documentation generation and deployment
  • Duplicate prevention to avoid publishing conflicts

This means zero manual intervention - just push code and everything is automatically built, tested, versioned, and published to all platforms within minutes.

Package Distribution

  • PyPI Package: pip install carla-driving-simulator-client
  • Docker Hub: docker pull akshaychikhalkar/carla-driving-simulator-client
  • GitHub Releases: Automated release management
  • Documentation: Complete API documentation on Read the Docs

πŸ—οΈ System Architecture

The project follows a microservices architecture with clear separation of concerns:

  • User Interface - React Frontend, Real-time Dashboard, Vehicle Controls, Sensor Visualization
  • API Gateway - FastAPI Backend, WebSocket Server, REST Endpoints, Authentication
  • CARLA Simulator - Unreal Engine, Python API, Sensor Data, Environment
  • Data Processing Layer - Sensor Data Processing, Image Recognition, Object Detection, Path Planning, Decision Making
  • Storage - Session Data, Logs, Analytics
  • Monitoring - Performance Metrics, System Health, Alerts

The user interface and API gateway exchange live data over WebSocket.

📦 Installation & Deployment

From PyPI

pip install carla-driving-simulator-client
carla-simulator-client

From Docker Hub

# Pull the latest image
docker pull akshaychikhalkar/carla-driving-simulator-client:latest

# Run with Docker Compose (recommended)
git clone https://github.com/akshaychikhalkar/carla-driving-simulator-client.git
cd carla-driving-simulator-client
docker-compose -f docker-compose-prod.yml up -d

From Source

git clone https://github.com/akshaychikhalkar/carla-driving-simulator-client.git
cd carla-driving-simulator-client
pip install -e .

🔧 Technical Challenges

Sensor Integration Complexity

Integrating multiple sensor types (LIDAR, cameras, GPS) and processing their data in real-time was one of the biggest challenges. Each sensor has different data formats, update frequencies, and coordinate systems that needed to be synchronized.

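One common way to handle this is to buffer each sensor stream and only emit a fused frame once every stream has a reading within a small time window of a common reference timestamp. The sketch below is an illustrative simplification; the class name, tolerance, and buffering policy are assumptions, not the project's exact implementation.

from collections import deque

class SensorSync:
    """Buffer per-sensor readings and emit a fused frame when all sensors
    have data near a common timestamp."""

    def __init__(self, sensor_names, tolerance=0.05):
        self.buffers = {name: deque() for name in sensor_names}
        self.tolerance = tolerance  # max timestamp skew in seconds

    def push(self, sensor_name, timestamp, data):
        self.buffers[sensor_name].append((timestamp, data))
        return self._try_emit()

    def _try_emit(self):
        if any(not buf for buf in self.buffers.values()):
            return None
        # Align on the newest "oldest" reading across all sensors.
        reference = max(buf[0][0] for buf in self.buffers.values())
        # Drop readings too stale to ever match the reference time.
        for buf in self.buffers.values():
            while buf and reference - buf[0][0] > self.tolerance:
                buf.popleft()
        if any(not buf or abs(buf[0][0] - reference) > self.tolerance
               for buf in self.buffers.values()):
            return None  # wait for more data
        return {name: buf.popleft()[1] for name, buf in self.buffers.items()}
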
Real-Time Performance Optimization

The web interface needed to display sensor data and vehicle state in real-time without lag. This required careful optimization of data transmission, efficient rendering of 3D point clouds, and WebSocket connection management.

Physics Simulation Accuracy

Implementing realistic vehicle physics that accurately represents real-world behavior was crucial for meaningful simulation results. This involved understanding vehicle dynamics, tire models, and suspension systems.

Production Deployment

Making the system production-ready involved Docker containerization, database migrations, monitoring integration, and security considerations for web interfaces.

🎯 What I Learned

Autonomous Driving Systems

  • Understanding the complexities of autonomous vehicle systems
  • Sensor fusion and real-time data processing techniques
  • Physics-based vehicle modeling and control algorithms
  • Path planning and obstacle avoidance strategies

Real-Time Systems Development

  • High-performance data processing and visualization
  • WebSocket communication for real-time updates
  • Optimizing performance for latency-sensitive applications
  • Memory management for large datasets

Microservices Architecture

  • Designing scalable, modular systems
  • API design and documentation best practices
  • Service communication and data flow management
  • Database design for complex sensor data

DevOps & Production Deployment

  • Docker containerization and orchestration
  • CI/CD pipeline automation with GitHub Actions
  • Package distribution on PyPI and Docker Hub
  • Automated versioning and release management
  • Monitoring and logging in distributed systems

🚀 Future Enhancements

  • Multi-vehicle Support - Running multiple autonomous vehicles simultaneously
  • Advanced AI Integration - Incorporating more sophisticated AI models for perception
  • Cloud Deployment - Scaling to cloud infrastructure for distributed simulations
  • Extended Sensor Support - Adding support for more sensor types and configurations
  • Kubernetes Deployment - Moving to Kubernetes for better orchestration
  • Advanced Analytics - Real-time analytics and machine learning integration

📊 Project Impact

This project has been instrumental in my understanding of autonomous driving systems and real-time software development. It combines multiple complex technologies into a cohesive, user-friendly system that can be used for research, education, and development of autonomous driving algorithms.

The project demonstrates:

  • Production-Ready Quality: Automated testing, CI/CD, and deployment
  • Community Accessibility: Published on PyPI and Docker Hub
  • Professional Standards: Comprehensive documentation and versioning
  • Scalable Architecture: Microservices design for future growth

This project represents my passion for cutting-edge technology and my commitment to building systems that can have real-world impact. The journey from concept to production deployment demonstrates the importance of proper architecture, automation, and community accessibility. 🚗