Big Data Engineer job at Standard Focus Ltd

  • Full Time
  • Nairobi

Standard Focus Ltd

Big Data Engineer

General Duties & Responsibilities
• Working across a number of business areas, providing development, maintenance and support
• Working as part of a team, and occasionally on solo developments, as business needs arise
• Building, deploying, and maintaining mission-critical analytics solutions that process data quickly at big-data scale
• Contributing design, code, configurations, and documentation for components that manage data ingestion, real-time streaming, batch processing, data extraction, transformation, and loading across multiple game franchises
• Cross-training other team members on technologies being developed, while continuously learning new technologies from them
• Interacting with engineering teams to ensure that solutions meet customer requirements for functionality, performance, availability, scalability, and reliability
• Working directly with business analysts and data scientists to understand and support their use cases
• Performing development, QA, and DevOps roles as needed to take end-to-end responsibility for solutions
• Designing and maintaining data systems and databases, including fixing coding errors and other data-related problems
• Mining data from primary and secondary sources, then reorganizing it into a format that can be easily read by human or machine
• Using statistical tools to interpret data sets, paying particular attention to trends and patterns that could be valuable for diagnostic and predictive analytics
• Demonstrating the significance of this work in the context of local, national, and global trends that affect both the organization and the industry
• Preparing dashboards as self-service BI for executive leadership that effectively communicate trends, patterns, and predictions using relevant data
• Collaborating with programmers, engineers, and organizational leaders to identify opportunities for process improvements, recommend system modifications, and develop data-governance policies
• Creating documentation that allows stakeholders to understand the steps of the data-analysis process and replicate the analysis if necessary
• Selecting appropriate datasets and data-representation methods
• Keeping abreast of developments in the field
• Helping identify probable causes and providing immediate solutions during incidents
• Working within an agile environment and following an agile framework
• Contributing ideas for making applications better and easier to use
• Participating in cutting-edge research in artificial intelligence and machine learning applications
• Contributing to engineering efforts from planning and organization through execution and delivery, solving complex, real-world engineering problems
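By way of illustration only, the batch extract-transform-load work described above might look like the following minimal Python sketch. All function names, field names, and data here are hypothetical examples, not part of the role description.

```python
# Hypothetical ETL sketch: extract from a source, clean the records,
# load into a sink. Names and data are illustrative, not prescriptive.

def extract(rows):
    """Simulate ingestion from a primary source (here, an in-memory list)."""
    return [dict(r) for r in rows]

def transform(rows):
    """Normalize fields and drop records missing a user_id (data cleaning)."""
    out = []
    for r in rows:
        if r.get("user_id") is None:
            continue  # skip malformed record
        out.append({"user_id": r["user_id"], "score": float(r.get("score", 0))})
    return out

def load(rows, sink):
    """Append cleaned rows to a target store (here, a plain list)."""
    sink.extend(rows)
    return len(rows)

raw = [{"user_id": 1, "score": "3.5"}, {"user_id": None}, {"user_id": 2}]
sink = []
loaded = load(transform(extract(raw)), sink)
print(loaded)  # 2 (the malformed record is dropped)
```

In a production pipeline each stage would of course read from and write to real systems (e.g. HDFS, Hive, or BigQuery, as listed under Skills), but the shape of the work is the same.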

Skills and Experience
• Preferred skills in NoSQL, Pig, MATLAB, SAS, Java, Ruby, C++, Perl, and APIs for working with available data sources
• Ability to evaluate and improve all aspects of our existing ETL system
• Experience with big data tools and architectures, such as Cloudera Hadoop, HDFS, Hive, BigQuery, Snowflake and Spark.
• Experience with the Python programming language and frameworks such as Flask and aiohttp
• Understanding of data structures, data modelling and software architecture
• Advanced knowledge of SQL queries
• Working knowledge of telematics interfaces and streaming solutions (MQTT, NiFi, Kafka, etc.).
• Experience with cloud platforms such as Google Cloud Platform (GCP) or AWS would be a plus
• Ability to work in a team
• Outstanding analytical and problem-solving skills
• Experience in multi-threading, message queues, WebSockets
• Experience with automation of the development and test processes through CI/CD pipelines (GitLab, SonarQube, Artifactory, Docker containers)
• Intermediate knowledge of API development
• Proven track record of building high-performance, highly available and scalable systems
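The multi-threading and message-queue experience listed above can be illustrated with a minimal producer/consumer sketch in Python. This is a generic illustration of the pattern; the message payloads and processing step (squaring numbers) are placeholders, not anything specified by the role.

```python
import queue
import threading

# Illustrative producer/consumer pattern: one thread publishes messages
# to a queue, another consumes and processes them. Payloads are dummies.

def producer(q, n):
    for i in range(n):
        q.put(i)
    q.put(None)  # sentinel: signal end of stream

def consumer(q, results):
    while True:
        msg = q.get()
        if msg is None:
            break  # stream finished
        results.append(msg * msg)  # placeholder "processing" step

q = queue.Queue()
results = []
t1 = threading.Thread(target=producer, args=(q, 5))
t2 = threading.Thread(target=consumer, args=(q, results))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # [0, 1, 4, 9, 16]
```

The same decoupled shape scales up to brokered systems such as Kafka or MQTT, where the queue is replaced by a topic and producers and consumers run as separate services.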

Job Qualifications
• Bachelor’s degree in Computer Science, or related technical field, or equivalent work experience.
• 3 – 5 years of relevant work experience.
• Experience designing and implementing distributed software systems (e.g., in Python).
• Research or Industry experience in Big Data, Artificial Intelligence, Machine Learning (ML) models, ML infrastructure, Natural Language Processing or Deep Learning.
• Good oral and written English communication skills
• Strong grasp of established and emerging technologies, systems, platforms, and software
• Ability to organize and manage multiple priorities
• Technical curiosity – Willingness to explore and learn new technologies that are unfamiliar
• Ability to work in a fast-paced, delivery-oriented environment
• Ability to deliver short-term results while investing in long-term strategic solutions
• Self-starter who is self-motivated and able to learn independently
• Team player who is eager to help others to succeed

To apply for this job, please visit www.linkedin.com.
