Chief Technology Officer
  TAUS
  Amsterdam Area, Netherlands

Summary

C-level professional with 15+ years of experience, a solid technical background and strong business management skills. Has worked across a variety of industries, performing a range of roles from functional to managerial and executive. Holds a Bachelor's in Computer Science, a Master in Business Administration (MBA) with specialisation in Enterprise Management and a comprehensive set of industry technology certifications and specializations. Proven experience in building high-performance agile teams, defining lean business processes and strategic thinking.

Notable recent professional experiences include LiDL Germany, helping build a machine learning pipeline that trains models on 1TB+ of data weekly. The models were used to predict the quantity of pastry articles to bake for the three baking batches in LiDL stores around the world, leading to savings in waste of up to €4M annually. For the European Union's ENISA division, created an artificial-intelligence-enabled Cyber Security Awareness Machine. This system indexes cyber security information from different sources and applies NLP and Knowledge Graph techniques so that the information can be queried in a more human-friendly way, allowing users to full-text search the gathered information and create awareness reports to spread throughout the European Union.

Currently holds the Chief Technology Officer position at TAUS, coming from Principal Software Engineer. At TAUS, led the company's data and digital transformation by creating a Data Architecture that currently supports over 1TB of TAUS and customer language data, with a focus on serverless technology. Deployed a microservices architecture across the company's IT in order to scale business domains in a controlled manner. Defined several business processes to decrease the company's operational cost. Promoted a culture of knowledge sharing, which allowed the company to move people across projects and needs, increasing IT's range of operation without needing to hire more people. Led the digital transformation project that completely revamped TAUS's service and website technologies, allowing the company to go digital faster than ever before. Continuously defines the right set of services and technologies for IT to use, matching them to the business goals and keeping the company competitive and the teams up to date with what matters. Mentors team members in new technologies through workshops and PoCs.

Some Key Skills:
  • Comfortable programming in Java (main language), Python, Ruby and Java(ECMA)Script; comfortable meaning knowledge of each language's syntax and semantics as well as its ecosystem of tools, libraries and best practices.

  • Considerable knowledge of Apple Swift/Xcode and the overall iOS/iPadOS and Android ecosystems.

  • Very strong experience with TDD/BDD.

  • Very strong knowledge of CI/CD and of tools such as Git, source repositories (GitHub/GitLab/Bitbucket), Maven/Gradle, Ansible, VMs/containers and GitHub Actions/Jenkins.

  • Very strong communication and mentorship skills.

  • Very strong knowledge of ETL, data transformation, data wrangling, data processing and data engineering, and of technologies for structured and unstructured data such as SQL and NoSQL databases, Big Data/Hadoop and streaming with Spark/Kafka.

  • Very strong understanding of best practices for Software Security.

  • Very strong understanding of cloud computing and experience with the major providers such as AWS and Google Cloud.

  • Very strong knowledge of Machine Learning concepts, libraries, tools and the overall workflow.

Education

Master of Business Administration (MBA)
Fundação Getúlio Vargas São Paulo
2010 - 2011
https://educacao-executiva.fgv.br/

Graduate Studies Professor
Senac São Paulo
2010 - 2011
http://www.sp.senac.br

Bachelor's in Computer Science
Pontifícia Universidade Católica de São Paulo
1999 - 2004
https://www.pucsp.br/graduacao/ciencia-da-computacao

Java and Middleware Certified Instructor
Oracle Brazil
2008 - Present*
https://education.oracle.com/java/java/pFamily_48
* Currently living in the Netherlands; still an accredited Oracle instructor back in Brazil.

Java/SAP ABAP Certified Instructor
SAP
2004 - 2008
https://www.sap.com

Specializations

Certifications and Courses

Certified Web Components for Java Platform
Certified Programmer for Java Platform
Certified Associate for Java Platform

Oracle (formerly Sun) University, 2006

Hadoop Operations: Hadoop Administration Foundation

Hortonworks/Teradata, 2018

SAP Certified Instructor for SAP Technologies
SAP Netweaver Workbench for ABAP

SAP University and Instructors Program, 2008

MongoDB Aggregation Framework (M121)
MongoDB for DBAs (M102)
MongoDB for Java Developers (M101J)

Mongo University, 2016

COBIT - Auditing IT Systems

Galegale & Associates, 2011

ReST from Scratch
Continuous Delivery (by Martin Fowler and Jez Humble)

QCon Conference Workshops, 2010

Web Application Security Training

Bonsai Information Security, 2010

Introduction to TCP/IP Architecture

Telefonica Brazil, 2006

Professional Experience

TAUS - https://taus.net

Head of Data & Engineering, March 2020 - Present, Amsterdam, The Netherlands

At TAUS, a data company whose main asset is translation data, we have over 35 billion words in several languages and we have platforms that enable translators to produce translation data - https://hlp.taus.net - or to sell the translation data they own - https://datamarketplace.taus.net. We apply NLP technology to all our data to ensure that it is clean, reliable and ready for machine translation training, and for that we have a robust data pipeline made of several components that ensure the high quality of our data. Our business pipeline also includes a planned expansion of the data offering to other kinds of data such as annotated data, sounds, voices and more. At TAUS I am in charge of the whole engineering and data teams; I oversee everything that is created and align our technologies with the business goals. What this really means is that I am one of the actual doers: I don't just communicate and explain the need, I also design and help shape how it will look technically - a very hands-on position and attitude. I am responsible for designing and implementing most of our core Data Architecture, which is entirely based on serverless pipelines. I head multidisciplinary, polyglot teams made of back-end and front-end engineers and NLP researchers and analysts, working on different fronts that come together to deliver production-ready products.
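
For illustration, a minimal sketch of the kind of serverless cleaning step such a pipeline is built from, assuming an AWS Lambda style handler in Python invoked with a small batch of segment pairs; the function, the event shape and the heuristics below are placeholders rather than TAUS's actual code:

```python
# Minimal sketch of a serverless cleaning step, assuming an AWS Lambda handler
# invoked with a batch of segment pairs; names and heuristics are placeholders.
import json

def looks_clean(source: str, target: str) -> bool:
    """Very small heuristic filter for a translation segment pair."""
    if not source.strip() or not target.strip():
        return False                      # drop empty segments
    ratio = len(target) / max(len(source), 1)
    return 0.3 <= ratio <= 3.0            # drop wildly mismatched lengths

def handler(event, context=None):
    """Lambda entry point: receives a batch of segment pairs, returns the clean ones."""
    segments = event.get("segments", [])
    clean = [s for s in segments if looks_clean(s["source"], s["target"])]
    return {"kept": len(clean), "dropped": len(segments) - len(clean), "segments": clean}

if __name__ == "__main__":
    sample = {"segments": [
        {"source": "Hello world", "target": "Hallo wereld"},
        {"source": "Hello", "target": ""},
    ]}
    print(json.dumps(handler(sample), indent=2))
```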

Teradata - https://www.teradata.com

Lead Software Engineer, November 2017 - March 2020, Amsterdam, The Netherlands

Responsible for helping customers succeed with Big Data and Artificial Intelligence (Deep Learning) projects.

  • Architect and deploy Big Data projects based on Hadoop, with orchestration and ingestion handled by Luigi, Airflow or Kylo/NiFi and consumption via Kafka, Spark and others.

  • Architect 'AnalyticsOps' workflows that automate testing, training, deployment and 'champion/challenger' evaluation of machine learning models across a considerable range of tools and frameworks such as SparkML, scikit-learn and TensorFlow/Keras, with model deployment formats including K5, ONNX and PMML (a minimal sketch follows below).
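
A minimal champion/challenger gate in scikit-learn, shown only to illustrate the idea; the data, the models and the promotion threshold are assumptions, not any client's actual workflow:

```python
# Minimal champion/challenger sketch on synthetic data; everything here is illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_holdout, y_train, y_holdout = train_test_split(X, y, test_size=0.3, random_state=0)

champion = LogisticRegression(max_iter=1000).fit(X_train, y_train)          # current production model
challenger = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

champion_auc = roc_auc_score(y_holdout, champion.predict_proba(X_holdout)[:, 1])
challenger_auc = roc_auc_score(y_holdout, challenger.predict_proba(X_holdout)[:, 1])

# Promote only if the challenger beats the champion by a margin on the same holdout set.
if challenger_auc > champion_auc + 0.01:
    print(f"Promote challenger ({challenger_auc:.3f} vs {champion_auc:.3f})")
else:
    print(f"Keep champion ({champion_auc:.3f} vs {challenger_auc:.3f})")
```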

Projects

Backschema Prognose
Customer

LiDL Germany - https://www.lidl.com/

Description:

A platform to generate forecasts of baking articles for LiDL stores through Machine Learning. The platform forecasts how many baking articles should be baked on a given day; by using Machine Learning techniques it was possible to predict ideal quantities while preserving sales and decreasing waste. The expected savings from reduced waste are ~€4M per year, making this a significant project for the company.
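
For illustration only, a minimal per-article demand forecast on synthetic data with scikit-learn; the real pipeline ran on Hadoop/Spark with far richer features, so the columns and model below are assumptions:

```python
# Minimal demand-forecast sketch over simple calendar/sales features; synthetic data only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000
weekday = rng.integers(0, 7, n)             # 0=Mon .. 6=Sun
batch = rng.integers(0, 3, n)               # one of the 3 baking batches
last_week_sales = rng.poisson(40, n)
# Synthetic "true" demand: busier weekends, larger first batch, plus last week's trend.
demand = (30 + 10 * (weekday >= 5) + 5 * (batch == 0) + 0.5 * last_week_sales
          + rng.normal(0, 3, n))

X = np.column_stack([weekday, batch, last_week_sales])
X_train, X_test, y_train, y_test = train_test_split(X, demand, test_size=0.2, random_state=0)

model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
pred = np.clip(np.round(model.predict(X_test)), 0, None)   # bake counts are non-negative integers
print("MAE (articles):", round(mean_absolute_error(y_test, pred), 2))
```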

Attributions:

We were a multifunctional agile team using Scrum to achieve the business goals, so I acted on different fronts in this project. Key things I was responsible for were:

  • Improve the performance of several modules in order to meet SLAs.

  • Define CI/CD workflows, creating environments such as DEV, QA and PROD and defining the release cycle and deployment across those environments.

  • Support the team on how to properly use Hadoop and Spark for more performant and resource-friendly consumption.

  • Code reviews

  • Write tests for the platform's key modules

Techs:

Python, Hadoop, Spark, ScikitLearn, Pandas, Luigi, Jenkins

Cognitive Platform
Customer

Bankia Spain - https://www.bankia.com/en/

Description:

A platform to ingest, process and streamline paper forms such as tax submissions, mortgages, loans, etc. The platform receives a scanned document, recognises its type with machine learning image recognition models, splits it into sub-images, applies OCR to extract the text, and uses other machine learning models to identify things like ticked checkboxes; with all the information captured, it sends the result to the proper bank system for further processing. This project led to huge savings in money and time for the bank through a significant decrease in manual, error-prone processing.
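
A minimal sketch of the pipeline's stages with stubbed model and OCR calls; the real system used TensorFlow/Keras models and OCR on AWS, so every function body below is a placeholder, not Bankia's implementation:

```python
# Skeleton of the document pipeline: classify -> split -> OCR -> detect checkboxes -> route.
# All model/OCR calls are stubs kept only to show the flow of data between stages.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ScannedDocument:
    doc_id: str
    pages: List[bytes]                      # raw page images
    doc_type: str = "unknown"
    fields: Dict[str, str] = field(default_factory=dict)

def classify(doc: ScannedDocument) -> str:
    """Placeholder for the image-recognition model that predicts the form type."""
    return "mortgage_application"

def split_into_regions(page: bytes) -> List[bytes]:
    """Placeholder for splitting a page into sub-images (header, fields, checkboxes)."""
    return [page]

def ocr(region: bytes) -> str:
    """Placeholder for the OCR step that extracts text from a sub-image."""
    return "APPLICANT: J. DOE"

def detect_checkboxes(region: bytes) -> Dict[str, bool]:
    """Placeholder for the model that detects ticked checkboxes."""
    return {"accepts_terms": True}

def process(doc: ScannedDocument) -> ScannedDocument:
    doc.doc_type = classify(doc)
    for page in doc.pages:
        for region in split_into_regions(page):
            doc.fields["text"] = ocr(region)
            doc.fields.update({k: str(v) for k, v in detect_checkboxes(region).items()})
    return doc                               # would then be routed to the proper bank system

if __name__ == "__main__":
    print(process(ScannedDocument("doc-1", [b"fake-image-bytes"])))
```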

Attributions:

I was responsible for creating the 'Model Management Framework' (MMF), a component whose goal was to provide a functional abstraction layer between the data scientists and the whole automated infrastructure. The MMF orchestrated all the steps needed to produce a machine learning model, such as testing, training, deployment and release, applying strategies like 'Champion/Challenger' with multi-model deployment. Alongside the framework, a business user interface was created so data scientists could trigger training, deployments and releases for the models managed by the framework.
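
A minimal sketch of the MMF idea: a thin facade that hides the CI/CD machinery (Jenkins/Bamboo jobs, artifact storage, deployment triggers) behind a few calls a data scientist can make; the class and method names are assumptions for illustration, not the framework's real API:

```python
# Illustrative facade only: in the real framework each step triggered CI jobs and
# artifact promotion; here they are in-process callables so the sketch runs standalone.
from typing import Any, Callable, Dict

class ModelManagementFramework:
    def __init__(self):
        self._registry: Dict[str, Dict[str, Any]] = {}   # model name -> metadata

    def register(self, name: str, train_fn: Callable[[], Any], eval_fn: Callable[[Any], float]):
        self._registry[name] = {"train": train_fn, "evaluate": eval_fn,
                                "champion": None, "champion_score": None}

    def train_and_challenge(self, name: str) -> str:
        """Train a challenger, evaluate it, and promote it if it beats the champion."""
        entry = self._registry[name]
        challenger = entry["train"]()                    # in reality: a training job on the cluster
        score = entry["evaluate"](challenger)            # in reality: evaluation on a frozen test set
        if entry["champion_score"] is None or score > entry["champion_score"]:
            entry["champion"], entry["champion_score"] = challenger, score
            return f"{name}: challenger promoted (score={score:.3f})"
        return f"{name}: champion kept (score={score:.3f} <= {entry['champion_score']:.3f})"

if __name__ == "__main__":
    mmf = ModelManagementFramework()
    mmf.register("checkbox-detector", train_fn=lambda: object(), eval_fn=lambda m: 0.92)
    print(mmf.train_and_challenge("checkbox-detector"))
```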

Techs:

Python, AWS Cloud, Tensorflow, Keras, Jenkins, Bamboo, Nexus, Kafka, HBase, Docker, Ansible

Open CSAM (Open Cyber Security Awareness Machine)
Customer:

ENISA European Union - https://www.enisa.europa.eu/

Description:

An artificial-intelligence-enabled Cyber Security Awareness Machine. This system indexes cyber security information from different sources and applies NLP and Knowledge Graph techniques so that the information can be queried in a more human-friendly way, allowing users to full-text search the gathered information and create awareness reports to spread throughout the European Union.
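
For illustration, a minimal full-text search call, assuming an Elasticsearch index of scraped articles and the Elasticsearch 8.x Python client; the index name and fields are hypothetical:

```python
# Minimal full-text search sketch; cluster URL, index name and fields are placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

def search_awareness(query: str, size: int = 10):
    """Full-text search over the title/body of indexed cyber security articles."""
    resp = es.search(
        index="cyber-articles",
        query={"multi_match": {"query": query, "fields": ["title^2", "body"]}},
        size=size,
    )
    return [(h["_source"].get("title"), h["_score"]) for h in resp["hits"]["hits"]]

if __name__ == "__main__":
    for title, score in search_awareness("ransomware critical infrastructure"):
        print(f"{score:6.2f}  {title}")
```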

Attributions:

I helped define the base architecture, managed the Elasticsearch cluster, implemented the content-scraping web spiders, implemented the user interface and managed the technical aspects of the project as a whole.
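
A minimal sketch of one such content-scraping spider, assuming Scrapy; the source site, selectors and fields are placeholders rather than the spiders actually used in Open CSAM:

```python
# Illustrative Scrapy spider; the target site and CSS selectors are hypothetical.
import scrapy

class SecurityNewsSpider(scrapy.Spider):
    name = "security_news"
    start_urls = ["https://example.com/security/news"]   # placeholder source

    def parse(self, response):
        for article in response.css("article"):
            yield {
                "title": article.css("h2::text").get(),
                "url": response.urljoin(article.css("a::attr(href)").get()),
                "published": article.css("time::attr(datetime)").get(),
                "body": " ".join(article.css("p::text").getall()),
            }
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```

Items scraped this way would then be indexed into the Elasticsearch cluster for the full-text search shown above.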

Techs:

Python, Elasticsearch, Covalent (Angular), NLP, Machine Learning, Knowledge Graph, Docker, Ansible, AWS Cloud.

Backbase - https://www.backbase.com

Chapter Lead/Senior Software Engineer, June 2016 - September 2017 (1 year 4 months), Amsterdam Area, Netherlands

Responsible for leading teams to develop and deploy Backbase Portal projects for customers all over the world. Talked directly to customer senior management and architects in order to understand the requirements and implement them in the product. Also acted as a trainer and interacted with the R&D department to help enhance the Backbase product as a whole. I executed Backbase projects all around the world, including Switzerland, Mexico, the United Kingdom, Thailand and Iceland.

From Backbase CXP version 5.x to 6.x the underlying architecture changed from a monolith to microservices; I helped with this process by creating a library that allowed version 5.x of the product to be deployed in a microservices architecture.

UOL - https://www.uol.com.br

Project Coordinator and Solution Architect, January 2013 - January 2014 (1 year 1 month), São Paulo Area, Brazil

My main responsibilities were to coach teams, design and implement systems, deliver training, and suggest and build proofs of concept of new frameworks and tools, aiming to improve the overall productivity of the teams I worked with.

Some of my main accomplishments:

  • Managed multiple teams simultaneously, delegating tasks and coordinating releases of different projects.

  • Resolved the main problem of one of the projects while leading the team through its other tasks.

  • Influenced other teams to use new technologies, practices and tools.

  • Negotiated new features with stakeholders.

  • Reduced the operational cost of the revenue assurance department by ~30%.

Projects

Commission System
Description

A system to calculate commissions and trigger payment requests for the UOL sales partners spread across the entire country. The system had to deal with a considerable amount of data: all orders sold per month by each partner.

Attributions

When I joined this project the billing engine was performing poorly, taking around 6 days to finish the commission calculation. I rewrote the whole calculation engine from scratch in 2 months and, by completely changing the commission calculation strategy, reduced the total time to finish the calculation to 2 minutes.

Techs:

Java 5, Maven, JUnit, Spring Framework, Log4J, EhCache, Hibernate, Git, Jetty

Veridcad Fraud Prevention and Analysis System
Description

A rule-based system to pre- and post-validate any subscription attempt for any service within UOL, with the goal of preventing fraud. The system applies rules such as checking whether the Brazilian Social Security Number (CPF) is valid and active with the Brazilian tax authority, whether the CPF is registered with credit protection companies, and other internal rules.
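
For illustration, a minimal sketch of one such rule, the local CPF check-digit validation; the registry lookups against the tax authority and credit protection companies are not reproduced here, and this is an illustration rather than UOL's actual code:

```python
# Structural CPF validation only (check digits); no external registry lookup is performed.
import re

def cpf_is_well_formed(cpf: str) -> bool:
    """Validate the two CPF check digits (structure only, no registry lookup)."""
    digits = re.sub(r"\D", "", cpf)
    if len(digits) != 11 or digits == digits[0] * 11:
        return False                                   # reject repeated-digit CPFs like 111.111.111-11
    for check_pos in (9, 10):                          # positions of the two check digits
        weights = range(check_pos + 1, 1, -1)          # weights 10..2, then 11..2
        total = sum(int(d) * w for d, w in zip(digits, weights))
        expected = (total * 10) % 11 % 10              # standard CPF check-digit formula
        if expected != int(digits[check_pos]):
            return False
    return True

if __name__ == "__main__":
    print(cpf_is_well_formed("529.982.247-25"))        # True: a commonly cited valid example
    print(cpf_is_well_formed("123.456.789-00"))        # False: second check digit does not match
```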

Attributions

I coordinated a team of 8 people to deploy this solution. I was responsible for gathering the requirements with the project stakeholders, translating them into functional requirements with the team and supervising the overall implementation.

Techs:

Java, Maven, JUnit, Spring Framework, Log4J, EhCache, Hibernate, Git, Jetty