I’m a senior backend engineer and architect with a passion for scaling to the moon.
A programmer from childhood, an engineer for 7 years, and an architect for 2.
I love to take an application from ideation to a production-grade deployment with the best tools for the job, focusing on extreme scale.
Working as the senior engineer at an early-stage ad-tech startup. Architecting and developing features and services with a focus on high-performance Go, highly scalable infrastructure, extremely low latencies (<10ms), and extremely high scale (>1M RPS).
Leading the Labs efforts on the OpenWeb backend. Working as a "Special Forces Engineer" to handle highly advanced projects on short timeframes, helping push OpenWeb's backend infrastructure to new levels.
Led a team of developers at OpenWeb (formerly Spot.IM). Designed and developed the OpenWeb Moderation 2.0 platform, which ingests, tags, and moderates all of the platform's user-generated content at high scale, with a focus on speed, reliability, and efficiency.
Worked as the senior engineer on the Engineering team at a social engagement platform company. Developed new services in Ruby, Elixir, and Go, and helped design the company's 2.0 architecture using event-sourcing patterns. Maintained all of the company's legacy services and led the rewrites of various legacy components onto the new architecture.
Worked on the Research and Analysis team at a product intelligence company. Developed new and innovative models to extract insights from scientific, social, and market data across various industries, as well as new tools for both internal use and use within the company's product.
Worked on the Research and Analysis and Professional Services teams at a product intelligence company. Developed various template frameworks in the Tableau data visualization environment, as well as a classification algorithm.
Worked as an independent consultant for a realty investments company. Developed a system that automated the company's federal tax-reporting processes.
Worked as an in-house software developer at an IT Services company. Developed various reporting tools, automation scripts, proof-of-concept projects, and custom applications for internal company use.
Worked as an independent consultant for a recruiting startup. Developed several programs for company use, including data crawlers and software integrations.
A proprietary platform for OpenWeb (formerly Spot.IM). The mission was to build a complete infrastructure for ingesting, tagging, and moderating millions of pieces of user-generated content per minute across the entire OpenWeb platform. The project was built with high scale, reliability, and efficiency in mind.
The project was built using an event-sourced architecture with several microservices managing different parts of the system. It was written in Go, utilizing SQS and Kafka as event queues, and deployed on Kubernetes. After the creation of the system, a case study was done with Google's Jigsaw team on the effect of nudges and real-time moderation on users.
A personal project to build a high-speed web server in pure Elixir.
The project began with me wanting to learn how to build a reactor-designed web server that could overtake the current Elixir and Erlang entries in the TechEmpower benchmarks. The scope grew as I learned how the entire HTTP protocol works, and I am currently expanding it into a complete web server.
A proprietary classification algorithm for Signals Analytics. The project entailed designing and developing a Natural Language Processing algorithm to take massive numbers of data points, each with its own text description, and classify what each one is about based on a provided set of rules.
The application was first built as a monolith and was then broken up into decoupled microservices deployed as highly scalable cloud-based workers. The project is developed primarily in C# with ASP.NET Web API, using Elasticsearch for search and RabbitMQ as a task queue.
Several proprietary analytics and insights models designed and engineered for Signals Analytics. The project entailed analyzing scientific, social, and market data across various industries and developing models in Tableau to extract valuable insights to deliver to clients.
In addition to designing analytics models, the work included building complex visualizations to display data in new and innovative ways, creating algorithms to solve specific problems for the company, and developing new features for the company's platform. The project is developed primarily in Tableau, using Tableau's internal SQL-like language.