The Data and Infrastructure Engineer is responsible for two main areas. The first is organizing, collecting, processing, and storing data from different sources and transforming raw, unstructured data into formats suitable for analysis, working with architectures such as databases, servers, and large-scale processing systems. The second is the design, planning, management, maintenance, and support of the organization's cloud computing environment, including support of applications on any Schlumberger-supported cloud.
Responsibilities:
- Managing cloud environments in accordance with company security guidelines.
- Acting as a focal point for gathering system requirements from application architects and owners to ensure seamless transformation and loading of data across different cloud vendors such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform.
- Performing data loading, data migration, and data retrieval.
- Advising on data governance, policies, roles, and data flow.
- Assembling large, complex data sets that meet functional and non-functional business requirements.
- Performing data mapping and data quality control across one or more data sources.
- Running assessments of the company's data and infrastructure environment and proposing improvements and recommendations.
- Working with stakeholders, including data, design, product, and executive teams, and assisting them with data-related technical issues.
- Designing RESTful services and integrating them with existing data providers, using JSON or XML as needed.
- Maintaining existing company databases, with a working knowledge of SQL and NoSQL and related data stores such as Postgres.
- Debugging technical issues inside a complex stack involving virtualization, containers, and microservices.
- Collaborating with engineering teams to enable digital applications to run on cloud infrastructure correctly, performantly, and efficiently.
- Deploying using container technologies such as Docker and Kubernetes.
- Building and maintaining experience with OpenStack, Linux/UNIX, Rackspace, Docker, and Microsoft Azure, as applicable to the customer and the Schlumberger implementation.
- Applying a working knowledge of web services, APIs, REST, and RPC.
- Handling incoming incidents, change requests, and problems as a first line of support.
- Operating, designing, developing, testing, and implementing infrastructure systems and hardware with automation in mind.
- Estimating project value and development plans from an infrastructure point of view and maintaining close liaison with the customer.
- Maintaining accurate and up-to-date documentation on all activities.
- Participating in an out-of-office on-call rotation as required.
- Installing and configuring IoT Agora solutions, including infrastructure, network setup, and SDK implementation.
- Maintaining and supporting existing on-premises applications, including licensing, installation, and configuration, where applicable.
- Developing experience with OSDU community code and Schlumberger Enterprise Data Management (EDM) workflows.

Minimum Requirements:
- Bachelor's degree in Engineering, Technology, Geophysics, Geology, or a related discipline, with a strong interest in data management.
- Minimum of 3 years of experience in the oil and gas industry supporting data management activities.
- Familiarity with a variety of data management software, regardless of platform.
- Experience with Petrel software.
- Knowledge of workflows such as data reception, data ingestion, data quality control, data visualization, and data delivery.
- Knowledge of industry standards such as OSDU, PPDM, and Energistics.
- Proficiency in English.

Roles and Responsibilities:
- Providing high-quality domain and software support to different clients using existing SLB platforms.
- Delivering training courses internally and to clients.
- Working with domain specialists from different areas, such as engineering, geology, and petrophysics.
- Supporting sales activities.