Welcome to My Portfolio

I’m Paolo Fullone, a trilingual backend software engineer specializing in C# and .NET. With a passion for scalable microservices and robust APIs, I build systems that handle millions of operations daily, from real-time data streams to high-coverage test suites. My 20+ years in procurement across Brazil and Mexico bring leadership and precision to every project, while fluency in English, Spanish, and Portuguese fuels global collaboration.

This portfolio showcases my most impactful work: projects like optimizing Kafka clusters or streaming stock data with SignalR. Each reflects my drive to solve complex challenges with clean code and thoughtful architecture. Explore them to see how I blend technical expertise with real-world results, and connect with me on LinkedIn or GitHub to dive deeper.

Trades GraphQL API

A Practical Guide

Why GraphQL Excels for Querying Large, Complex Datasets: A Real-World Trading API Case Study

Handling large, complex datasets efficiently is a critical challenge in modern APIs. In a real-world project, I built a .NET GraphQL API integrated with SignalR to retrieve and stream thousands of trades from a database in real time. The Trades class, with approximately 100 properties (e.g., trade ID, price, volume, timestamp, counterparty details), represents a rich dataset where different domains (e.g., analytics, reporting, trading desks) require specific subsets of data. Previously, I illustrated GraphQL’s benefits with a simplified task management API using a WorkTasks table. Here, I’ll show how those benefits apply to this real-world trading API, enabling precise, scalable, and flexible querying of trade data.
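As a sketch of the kind of precise querying this enables, a consumer can request only the handful of fields its domain needs instead of all ~100 Trade properties. The field names below are illustrative, not the API’s actual schema:

```graphql
# Hypothetical query from an analytics client: it selects just four
# of the Trade type's many properties, and the server returns nothing more.
query RecentTrades {
  trades(last: 100) {
    tradeId
    price
    volume
    timestamp
  }
}
```

A reporting client could issue the same `trades` query with a completely different field selection, with no new endpoint or DTO required on the server.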

[Read More]

Health Checks

A Practical Guide

In a microservices architecture, ensuring each service communicates reliably with databases like SQL Server and Oracle is critical but challenging due to restricted permissions. Health checks monitor service health, and in this post, I focus on advanced checks that execute custom queries to verify table access, beyond standard NuGet packages. I’ve learned to anticipate issues like permission errors, ensuring robust systems for global teams.

I developed a MultiDatabase health check to test specific queries across SQL Server and Oracle, addressing the challenge of limited database access rights that require DBA intervention (e.g., user creation, encrypted connection strings). This check integrates with CI/CD pipelines, catching issues before deployment, complementing my 95% test coverage in projects like RunningTracker.
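A minimal sketch of the idea behind such a check, assuming ASP.NET Core’s `IHealthCheck` abstraction; the class name and probe query are illustrative, not the production implementation:

```csharp
// Sketch of a health check that runs a lightweight custom query to verify
// table-level access, rather than only opening a connection as the standard
// NuGet checks do. A permission error on the probe query fails the check.
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;
using Microsoft.Extensions.Diagnostics.HealthChecks;

public sealed class MultiDatabaseHealthCheck : IHealthCheck
{
    private readonly string _connectionString;
    private readonly string _probeQuery; // e.g. "SELECT TOP 1 1 FROM dbo.Trades"

    public MultiDatabaseHealthCheck(string connectionString, string probeQuery)
    {
        _connectionString = connectionString;
        _probeQuery = probeQuery;
    }

    public async Task<HealthCheckResult> CheckHealthAsync(
        HealthCheckContext context, CancellationToken cancellationToken = default)
    {
        try
        {
            await using var connection = new SqlConnection(_connectionString);
            await connection.OpenAsync(cancellationToken);
            await using var command = connection.CreateCommand();
            command.CommandText = _probeQuery;
            await command.ExecuteScalarAsync(cancellationToken);
            return HealthCheckResult.Healthy("Table access verified.");
        }
        catch (Exception ex)
        {
            // Missing grants surface here, in CI/CD, instead of in production.
            return HealthCheckResult.Unhealthy("Probe query failed.", ex);
        }
    }
}
```

The same pattern applies to Oracle by swapping in an Oracle connection and an Oracle-flavored probe query.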

[Read More]

RunningTracker API

Mastering Comprehensive Testing

Inspired by my running hobby, I built a .NET 8 RESTful API to track amateur runs, focusing on testing mastery. Built with Dapper, Docker Compose, and SQL Server, and tested with xUnit, Reqnroll, and K6, it delivers robust, well-verified endpoints.

My procurement-honed discipline, built over 20+ years managing complex projects, drove rigorous testing, a skill I now apply to production systems.

The project honed my skills across unit, integration, functional, and load testing, and streamlined a testing process for maintainable code that carries over directly to production systems.
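As a taste of the functional-test layer, here is a hedged xUnit sketch against an in-memory host; the route and type names are assumptions, not the actual RunningTracker API surface:

```csharp
// Illustrative xUnit functional test using WebApplicationFactory to spin up
// the API in-memory and hit a hypothetical /api/runs endpoint.
using System.Net;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc.Testing;
using Xunit;

public class RunsEndpointTests : IClassFixture<WebApplicationFactory<Program>>
{
    private readonly System.Net.Http.HttpClient _client;

    public RunsEndpointTests(WebApplicationFactory<Program> factory)
        => _client = factory.CreateClient();

    [Fact]
    public async Task GetRuns_ReturnsOk()
    {
        // Exercises routing, DI, and the data layer end to end.
        var response = await _client.GetAsync("/api/runs");
        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
    }
}
```

Unit tests cover the handlers in isolation, while K6 scripts replay the same routes under load.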

[Read More]

SignalR for Real-Time Stock Data

Streaming vs. Paginated APIs

During this project, I was migrating a critical system from .NET Framework to .NET 8 as part of a move-to-cloud initiative. The legacy system was not designed to be cloud-native and lacked the tests I described in the RunningTracker project.

This new Web API replaced the part of the monolith responsible for retrieving from the database all the negotiations made by a number of clients and confirmed on B3 (the Brazilian stock exchange); the data was then processed to calculate taxes and commissions.
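Instead of clients polling paginated endpoints, SignalR lets the server push each negotiation as soon as it is processed. A minimal sketch of a streaming hub, with hub and DTO names assumed for illustration:

```csharp
// Sketch of a SignalR hub that streams processed negotiations to clients
// via IAsyncEnumerable, rather than returning pages on repeated requests.
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public record NegotiationDto(string Ticker, decimal Price, decimal Commission);

public sealed class TradesHub : Hub
{
    // Clients invoke this method and receive items as an async stream.
    public async IAsyncEnumerable<NegotiationDto> StreamNegotiations(
        [EnumeratorCancellation] CancellationToken cancellationToken)
    {
        await foreach (var item in FetchFromDatabaseAsync(cancellationToken))
        {
            yield return item; // pushed to the client the moment it is ready
        }
    }

    private static async IAsyncEnumerable<NegotiationDto> FetchFromDatabaseAsync(
        [EnumeratorCancellation] CancellationToken ct)
    {
        // Placeholder for the real data access layer (Dapper, EF Core, etc.).
        yield return new NegotiationDto("PETR4", 38.12m, 0.05m);
        await Task.Delay(100, ct);
    }
}
```

The client side consumes the same stream with `HubConnection.StreamAsync`, cancelling when it has seen enough.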

[Read More]

Kafka for Microservices

Scaling Financial Systems

I helped migrate a .NET monolith to microservices, processing 4-5 million daily operations for financial advisors. Using Kafka and Kubernetes, I streamlined complex commission rules. My 20+ years in procurement enabled me to align teams and deliver tough updates. Despite delays, my refactoring ensured cloud readiness.

The application was built using .NET and was running on a monolithic architecture on a Virtual Machine.

We decided to use Kafka as our message broker to decouple the services and allow for asynchronous communication between them.
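A minimal sketch of what publishing an operation event looks like with the Confluent.Kafka client; the topic name, key, and message shape are illustrative assumptions:

```csharp
// Sketch: produce an operation event to Kafka so downstream commission
// services can consume it asynchronously, decoupled from the producer.
using System;
using System.Threading.Tasks;
using Confluent.Kafka;

public static class OperationPublisher
{
    public static async Task PublishAsync(string operationJson)
    {
        var config = new ProducerConfig { BootstrapServers = "localhost:9092" };

        using var producer = new ProducerBuilder<string, string>(config).Build();

        // Keying by advisor ID (hypothetical) keeps one advisor's operations
        // ordered within a single partition.
        var message = new Message<string, string>
        {
            Key = "advisor-123",
            Value = operationJson
        };

        var result = await producer.ProduceAsync("operations", message);
        Console.WriteLine($"Delivered to {result.TopicPartitionOffset}");
    }
}
```

Partitioning by key is what lets consumers scale out across Kubernetes pods while preserving per-advisor ordering.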

[Read More]