💼 Hiring Quest – .NET Engineer (3 YOE) @ ElectroPi

Phase: Registration

Registration Deadline: October 31, 2025

Submission Deadline: November 8, 2025

To register for a quest, you need to create an account on our platform. If you've registered for any quest before, you already have an account. If you face any issues, please contact us on WhatsApp at 01558405326 or join our WhatsApp Community.

Register Now

Prizes

You get hired on a paid contract, with the opportunity to work on real-world projects.

👋 We are ElectroPi, empowering businesses to unlock the full potential of AI by providing customized solutions that address their unique challenges.
Whether it's optimizing workflows, enhancing decision-making, or driving innovation, our advanced AI tools and services are tailored to meet the specific needs of various industries.

We’re now hiring a .NET Engineer to join our on-site team and help us build the backend systems that power our intelligent automation and AI platforms.

🕓 Start Date: 1 November 2025

💰 Salary Range: 30,000 – 35,000 EGP

💼 Contract Type: Full-time, on-site


🛠️ How the Hiring Quest Works

  1.  Register for the quest

  2. Receive the full challenge after registration closes

  3. Submit your solution before the deadline

  4. Top candidates are invited to a technical review session

  5. One candidate will be hired


🔍 Who We’re Looking For

  1. ✅ 1–3 years of experience with .NET 6 or later

  2. ✅ Good knowledge of ASP.NET Core Web API

  3. ✅ Understanding of Entity Framework Core and SQL databases

  4. ✅ Experience with JWT authentication

  5. ✅ Comfortable building and documenting REST APIs

  6. ✅ Familiar with clean, modular code and async programming

  7. 💡 Bonus: Some experience with Docker, Redis, or unit testing


🎯 Your Mission: “Web Analytics Data Aggregator”

🧠 Business Context

ElectroPi helps clients use data to make smart decisions. In this quest, you’ll build a small backend system that reads data from two online tools, processes it through a real message broker, and exposes a simple reporting API.

Tools (mocked via JSON files):

  1. Google Analytics (GA) → website traffic data (users, sessions, views)

  2. PageSpeed Insights (PSI) → performance data (scores, timings)

You do not need real GA/PSI APIs — mock with JSON files.


📌 The Challenge (Queue is Mandatory)

  1. Step 1 – Read Data (Ingestion)

    1. Create two adapters/services that read JSON files.

      1. GA mock (example): { "date": "2025-10-20", "page": "/home", "users": 120, "sessions": 150, "views": 310 }

      2. PSI mock (example): { "date": "2025-10-20", "page": "/home", "performanceScore": 0.9, "LCP_ms": 2100 }

    2. Combine into a standard record:

      { "page": "/home", "date": "2025-10-20", "users": 120, "sessions": 150, "views": 310, "performanceScore": 0.9,"LCP_ms": 2100}

  2. Step 2 – Publish to a Real Message Broker (Required)

    1. Use RabbitMQ or Apache Kafka.

    2. In-memory queues are not allowed.

    3. Publish each combined record to a topic/queue:

      1. Kafka: topic analytics.raw

      2. RabbitMQ: exchange analytics.raw (fanout/direct) and a queue analytics.raw.q

  3. Step 3 – Process & Aggregate (Background Consumer)

    1. Create a background worker that consumes from the broker and aggregates:

      1. Per day:

        1. totalUsers, totalSessions, totalViews

        2. avgPerformance (mean of performanceScore)

      2. Persist with EF Core to your SQL DB.

        1. Aggregated example: { "date": "2025-10-20", "totalUsers": 480, "totalSessions": 550, "totalViews": 1200, "avgPerformance": 0.88 }

        2. Basic reliability: Acknowledge only after successful save; include simple retry (e.g., 3 attempts with backoff).

          If using RabbitMQ, you may use a dead-letter queue analytics.dlq (bonus).

  4. Step 4 – Reporting APIs (JWT-Protected)

    1. GET /reports/overview → totals across all pages & dates

    2. GET /reports/pages → grouped by page (with per-page totals/averages)

  5. Auth: Basic email/password signup/login + JWT (Bearer) for report endpoints.
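The consumer in Step 3 carries the trickiest requirements (manual acks, retry with backoff, DLQ routing), so here is a minimal sketch of that piece. It assumes the RabbitMQ.Client 6.x API, a `CombinedRecord` model matching the Step 1 JSON, and a `saveToDatabase` delegate standing in for the EF Core persistence step — all names besides the queue are illustrative, not a prescribed implementation.

```csharp
using System;
using System.Text;
using System.Text.Json;
using System.Threading;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

// Hypothetical model matching the combined record from Step 1.
public record CombinedRecord(
    string Page, DateOnly Date, int Users, int Sessions,
    int Views, double PerformanceScore, int LCP_ms);

public class AnalyticsConsumer
{
    private static readonly JsonSerializerOptions JsonOpts =
        new() { PropertyNameCaseInsensitive = true };

    // Stand-in for the EF Core save; throws on transient failure.
    private readonly Action<CombinedRecord> _saveToDatabase;

    public AnalyticsConsumer(Action<CombinedRecord> saveToDatabase)
        => _saveToDatabase = saveToDatabase;

    public void Start()
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        var connection = factory.CreateConnection();
        var channel = connection.CreateModel();
        channel.QueueDeclare("analytics.raw.q", durable: true,
                             exclusive: false, autoDelete: false);

        var consumer = new EventingBasicConsumer(channel);
        consumer.Received += (_, ea) =>
        {
            var json = Encoding.UTF8.GetString(ea.Body.ToArray());
            var record = JsonSerializer.Deserialize<CombinedRecord>(json, JsonOpts);

            // Retry up to 3 times with linear backoff; ack only after save.
            for (var attempt = 1; attempt <= 3; attempt++)
            {
                try
                {
                    _saveToDatabase(record!);
                    channel.BasicAck(ea.DeliveryTag, multiple: false);
                    return;
                }
                catch (Exception)
                {
                    if (attempt == 3)
                    {
                        // Reject without requeue; lands in analytics.dlq
                        // if a dead-letter exchange is configured (bonus).
                        channel.BasicNack(ea.DeliveryTag, multiple: false,
                                          requeue: false);
                        return;
                    }
                    Thread.Sleep(TimeSpan.FromSeconds(attempt));
                }
            }
        };
        channel.BasicConsume("analytics.raw.q", autoAck: false, consumer);
    }
}
```

In a real submission this would live inside a `BackgroundService` so the host manages its lifetime, and the aggregation into `DailyStats` would happen in the save step.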


🗄️ Database Design (Suggested)

  1. Users: Id, Name, Email, PasswordHash, CreatedAt

  2. RawData: Id, Date, Page, Users, Sessions, Views, PerformanceScore, LCPms, ReceivedAt

  3. DailyStats: Id, Date, TotalUsers, TotalSessions, TotalViews, AvgPerformance, LastUpdatedAt
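The suggested tables map naturally onto EF Core entities. A sketch of the `DailyStats` entity and a hypothetical `AppDbContext` (class names are illustrative; `Users` and `RawData` would follow the same pattern):

```csharp
using System;
using Microsoft.EntityFrameworkCore;

public class DailyStats
{
    public int Id { get; set; }
    public DateOnly Date { get; set; }
    public int TotalUsers { get; set; }
    public int TotalSessions { get; set; }
    public int TotalViews { get; set; }
    public double AvgPerformance { get; set; }
    public DateTime LastUpdatedAt { get; set; }
}

public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options)
        : base(options) { }

    public DbSet<DailyStats> DailyStats => Set<DailyStats>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // One aggregate row per day keeps the consumer's upsert simple.
        modelBuilder.Entity<DailyStats>()
            .HasIndex(s => s.Date)
            .IsUnique();
    }
}
```

The unique index on `Date` lets the consumer upsert idempotently, which also helps with the reliability criteria.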


✅ Acceptance Checks (Queue Enforcement)

To be considered complete, your submission must include:

  1. Docker Compose that starts API + DB + Broker (RabbitMQ or Kafka).

  2. A producer service that publishes to the real broker.

  3. A consumer background service that reads from the broker and writes to DB.

  4. Clear logs showing:

    1. messages published → consumed → saved

    2. retry attempts on transient failures

  5. Swagger showing the secured report endpoints.

  6. A quick seed script or pre-bundled JSON mock files to generate sample data.
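For check 1, a minimal docker-compose.yml might look like the config fragment below (service names, the RabbitMQ choice, and the Postgres password are illustrative; ports are the images' defaults):

```yaml
services:
  api:
    build: .
    ports: ["8080:8080"]
    depends_on: [db, broker]
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example
  broker:
    image: rabbitmq:3-management   # management UI on :15672
    ports: ["5672:5672", "15672:15672"]
```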


🎁 Bonus Points (Optional)

  1. ✨ Docker Compose healthchecks & wait-for scripts

  2. ✨ Dead-letter queue (DLQ) with reason captured

  3. ✨ Unit tests (adapters, aggregator)

  4. ✨ Minimal frontend page to display reports

  5. ✨ Metrics endpoint (e.g., /health, /metrics)

  6. ✨ README diagram of flow (Producer → Broker → Consumer → DB → API)


🧰 Tech Stack

  1. Backend: .NET 8 + ASP.NET Core Web API

  2. Database: SQL Server or PostgreSQL (EF Core)

  3. Broker (Required): RabbitMQ or Kafka

  4. Auth: JWT

  5. Docs: Swagger / OpenAPI

  6. Runtime: Docker Compose


📝 What You Should Submit

📂 GitHub Repository with:

  1. Organized code (API, Services, Data, Models, Background worker)

  2. docker-compose.yml (API + DB + Broker)

  3. README.md with setup steps:

    1. docker compose up -d

    2. how to seed/read JSON

    3. how to hit Swagger + get a JWT

  4. Optional ARCHITECTURE.md (flow diagram & key decisions)

📹 10-Minute Video

  1. 🎥 3 min — Introduce yourself + two technical challenges you’ve solved

  2. ⚙️ 7 min — Show the end-to-end flow (publish → consume → DB → API demo)


📊 Evaluation Criteria

  1. Code Quality & Structure (clean architecture, separation of concerns) 30%

  2. Completeness & Correctness (end-to-end through the real broker) 25%

  3. API Design & Documentation (clear models, Swagger, auth) 20%

  4. Database Design (sensible schema, EF migrations) 15%

  5. Reliability & Error Handling (retries, acks, idempotency basics) 10%

  6. Bonus: Docker polish, DLQ, tests, simple UI, metrics.


📩 After Submission

Top candidates will be invited to a technical review session.
👉 Final hiring decision within 3–5 business days after the review.


Making the world a better place through competitive, crowdsourced programming.