The Data and Infrastructure Engineer is responsible for two main aspects. First, organizing, collecting, processing, and storing data from different sources and transforming raw, unstructured data into formats suitable for analysis, using architectures such as databases, servers, and large-scale processing systems. Second, designing, planning, managing, maintaining, and supporting the organization's cloud computing environment, including support of applications on any Schlumberger-supported cloud.

Responsibilities:
- Manage cloud environments in accordance with company security guidelines.
- Act as the focal point for gathering system requirements from application architects and owners to ensure seamless transformation and loading of data across cloud vendors such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform.
- Perform data loading, data migration, and data retrieval.
- Advise customers on data governance, policies, roles, and data flow.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Perform data mapping and data quality control for one or more data sources.
- Run assessments of a company's data and infrastructure environment and deliver improvements and recommendations.
- Work with stakeholders, including data, design, product, and executive teams, assisting them with data-related technical issues.
- Use API knowledge to design RESTful services and integrate them with existing data providers, using JSON or XML as needed.
- Maintain existing company databases, which requires a working knowledge of SQL and NoSQL and of related data stores such as Postgres.
- Debug technical issues in a complex stack involving virtualization, containers, microservices, etc.
- Collaborate with engineering teams to enable digital applications to run on cloud infrastructure correctly, performantly, and efficiently.
- Deploy applications using container technologies such as Docker and Kubernetes.
- Gather and maintain experience working with OpenStack, Linux/UNIX, Rackspace, Docker, and Microsoft Azure, as applicable to the customer and the Schlumberger implementation.
- Maintain a working knowledge of web services, APIs, REST, and RPC.
- Handle incoming incidents, change requests, and problems as a first line of support.
- Operate, design, develop, test, and implement infrastructure systems and hardware with automation in mind.
- Estimate project value and development plans from an infrastructure point of view, and maintain close liaison with the customer.
- Maintain accurate, up-to-date documentation on all activities.
- Participate in an out-of-office on-call rotation as required.
- Install and configure IoT Agora solutions, including infrastructure, network setup, and SDK implementation.
- Maintain and support existing on-premises applications, from licensing to installation and configuration, where applicable.
- Gather and develop experience working on OSDU community code and Schlumberger Enterprise Data Management (EDM) workflows.

Minimum Requirements:
- Bachelor of Engineering or Science in Technology, Geophysics, Geology, or a related discipline, with a strong interest in data management.
- Minimum 3 years of experience in the oil and gas industry supporting data management activities.
- Familiarity with various data management software, regardless of the platform used.
- Experience using Petrel software.
- Knowledge of workflows such as data reception, data ingestion, data quality control, data visualization, and data delivery.
- Knowledge of industry standards such as OSDU, PPDM, and Energistics.
- Proficiency in English.

Roles and Responsibilities:
- Provide high-quality domain and software support to different clients, using existing SLB platforms.
- Deliver training courses internally and to clients.
- Work with domain specialists from different areas, such as engineering, geology, and petrophysics.
- Support sales activities.