Carbon Nigeria Job Recruitment for Data Engineer - How to Apply. Carbon is a pan-African digital bank with a mission to provide friction-free finance to its customers. Carbon promises to play a fundamental role in its customers’ lives wherever they are, with flexible solutions.
We pride ourselves on our efficiency: with just $10m of equity raised in 2015, we have disbursed over $100m in loans and earned more than $30m in revenue over the last 2 years. Carbon has operations in Ghana, Kenya, and Nigeria, supported by a talented team spread between Lagos, Nairobi, London, Argentina, and Palo Alto, so we operate with a remote-first mindset.
We invite applications from interested and qualified candidates to fill the position below:
Job Title: Data Engineer
Employment Type: Full-Time
Department: Business Intelligence
What are we looking for?
- This role will work very closely with data scientists and business intelligence analysts to build solutions that enable Data Science and Business Intelligence teams to create robust data products, and other Carbon departments to consume these services.
- We are looking for someone who can transform data into a format that can be easily analyzed and create the necessary connections to enable company units to consume transformed data.
- Candidates should have the ability to transform available raw data into formatted data through computer programming and database queries.
- The Data Engineer will be part of the Business Intelligence team.
Duties and Responsibilities
- Manipulating database management systems (DBMS).
- Creating and maintaining optimal data pipeline architecture.
- Implementing feature generation and machine learning model scripts.
- Developing, maintaining and testing the infrastructure to transform the data that feeds our dashboards and machine learning models.
- Running complex queries over data.
- Coding, testing and troubleshooting APIs/endpoints that enable other company assets to integrate with and consume data science and business intelligence products, such as credit risk or fraud scores.
- Creating the necessary connections to expose transformed data to other systems or tools.
Minimum Qualifications and Requirements
We are looking for candidates who meet the following criteria. We don’t expect you to meet all of them, but we would love you to have experience in most of these areas:
- Candidates should possess a Bachelor’s or Master’s Degree in a technical field such as Computer Science or Engineering, with 3+ years of professional experience
- Ability to understand simple business problems
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
- Advanced working SQL knowledge, including query authoring and familiarity with a variety of relational databases.
- Experience with relational SQL and NoSQL databases
- Experience designing, building, and testing API connections
- Experience with data pipelines
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience working with container technology (Dockerfiles, Docker images) and with GitHub repositories
- Experience with Talend (or any other ETL tool such as SSIS or Pentaho).
- Experience with Google Cloud Platform
- Experience with Tableau, Power BI, or BI Tools.
Benefits
- A great and upbeat work environment populated by a multinational team.
- Potential to work in different geographies.
- Health Insurance.
- Life Insurance
- Career development & Growth.
- We offer a remote working option.
Recruitment Process
- Call with the People team
- Case Study (Assessment)