Big Data Specialist DB Managements (DBMS) Jobs in Colombia

These are the latest Big Data Specialist DB Managements (DBMS) job offers found in Colombia.

 2 job(s)

  • 26/01/2023

    Amazonas, Antioquia, Arauca, Atlántico, Bolívar, Boyacá, Caldas, Caquetá, Casanare, Cauca, Cesar, Chocó, Córdoba, Cundinamarca, Guainía, Guaviare, Huila, La Guajira, Magdalena, Meta, Nariño, Norte de Santander, Putumayo, Quindío, Risaralda, Santander, Sucre, Tolima, Valle del Cauca, Vaupés, Vichada, San Andrés, Providencia y Santa Catalina, Bogotá

    N-iX is a software development services company that helps businesses across the globe develop successful software products. Over 20 years on the market, and by leveraging Eastern European talent, the company has grown to 2,000+ professionals with a broad portfolio of customers ranging from Fortune 500 companies to technology start-ups. Headquartered in Lviv, Ukraine, the company also has multiple development offices across the East European region and representative entities in the United States of America, Sweden, and Malta.

    In the role of Cloud Data Engineer, you will focus primarily on delivering the data platform footprint as an expansion of our client's cloud presence, enabling key initiatives such as our Digital Banking. You will work collaboratively with several teams across the organization, helping deliver data-related components and leveraging automation to provide self-service components as much as possible. You will be part of a team that drives the implementation of data components such as pipelines, applications, and data sanitization, supporting our data scientists and working with the rest of the product teams while collaborating with the architecture team, influencing the design and delivery of our data footprint, and striving for greater functionality in our data systems.

    Role: Data Engineer

    Requirements: At least 5 years of experience working in the data engineering field. Proficiency in SQL. Strong knowledge of Python. Experience in optimization of high-volume ETL processes. Experience with any of the popular clouds (GCP, AWS, Azure). Good knowledge of message broker systems (e.g., Kafka, Pub/Sub). Knowledge of and experience with the Google GCP data platform (Dataflow, Dataprep, Cloud Composer, BigQuery, Cloud SQL), or of the equivalent open-source toolset behind those products, is an asset. Upper-intermediate level of English.

    Nice to have: Good knowledge of popular data standards and formats (e.g., JSON, XML, Proto, Parquet, Avro, ORC). Data modelling skills. GCP Data Engineering Certification preferred.

    Responsibilities: Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Optimally extract, transform, and load data from a wide variety of data sources using SQL and Google Cloud data technologies (an illustrative pipeline sketch follows this listing). Collaborate with the team to decide which tools and strategies to use within specific data integration scenarios. Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs. Develop and maintain code and documentation for ETL and other data integration projects and procedures. Monitor and anticipate trends in data engineering, and propose changes in alignment with organizational goals and needs. Share knowledge with other teams on various data engineering or project-related topics.

    We offer: Flexible working format: remote, office-based, or hybrid. A competitive salary and good compensation package. Flexible and personalized career growth. Professional development tools (mentorship program, tech talks and training, centers of excellence, and more). Active tech communities with regular knowledge sharing. Education reimbursement. Paid vacation days, sick leave, and days off. Healthcare & sport program. Medical insurance. Memorable anniversary presents. Corporate events and team building.

    Labor Conditions: Workplace: Colombia - Remote. Type of Contract: Indefinite term. Salary: To be agreed according to experience. This vacancy is disclosed through ticjob.co
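
    The Dataflow, Pub/Sub, and BigQuery stack named in the requirements maps to a fairly standard streaming ETL pattern. Below is a minimal, illustrative sketch (not part of the posting) of that pattern in Python with Apache Beam: reading JSON events from Pub/Sub, shaping them, and appending rows to BigQuery. All project, topic, and table names are hypothetical placeholders.

```python
# Illustrative sketch only: a streaming ETL pipeline of the kind this role
# describes, built with Apache Beam for Google Cloud Dataflow.
# Project, topic, and table names below are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message into a flat row for BigQuery."""
    event = json.loads(message.decode("utf-8"))
    return {
        "user_id": event.get("user_id"),
        "action": event.get("action"),
        "amount": float(event.get("amount", 0)),
    }


def run():
    options = PipelineOptions(
        streaming=True,
        project="example-project",   # hypothetical project id
        region="us-central1",
        runner="DataflowRunner",
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/example-events")
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,action:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

    In practice the same pipeline can be run locally with the DirectRunner for testing and submitted to Dataflow for production; the schema and transform shown here are placeholders for whatever the actual events require.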

  • 30/12/2022

    Amazonas, Antioquia, Arauca, Atlántico, Bolívar, Boyacá, Caldas, Caquetá, Casanare, Cauca, Cesar, Chocó, Córdoba, Cundinamarca, Guainía, Guaviare, Huila, La Guajira, Magdalena, Meta, Nariño, Norte de Santander, Putumayo, Quindío, Risaralda, Santander, Sucre, Tolima, Valle del Cauca, Vaupés, Vichada, San Andrés, Providencia y Santa Catalina, Bogotá

    Do you have experience developing ETLs with GCP? What are you waiting for to apply? We are one of the most important technology companies and we work hand in hand with high-value businesses.

    Role: GCP Data Engineer - Google Cloud Platform

    Requirements: Technician, technologist, or professional degree in Systems Engineering, Electronics, or related fields. Minimum of two (2) years of experience as a Data Engineer with Google Cloud Platform. Proficiency in GCP and SQL is essential (a short SQL-on-GCP sketch follows this listing).

    We offer great professional benefits, such as: Career plan and continuous training with the company. Udemy. Remote work. Prepaid health plan. Great opportunities for professional development. Others.

    Number of Vacancies: 3

    Labor Conditions: Salary: To be agreed according to experience or professional profile. Type of Contract: Indefinite term. Work Modality: Colombia - Remote. This vacancy is disclosed through ticjob.co
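
    As a hedged illustration of the GCP and SQL work this posting asks for (not part of the posting itself), the sketch below runs a daily transformation query with the google-cloud-bigquery Python client and appends the result to a reporting table. The project, dataset, and table names are hypothetical.

```python
# Illustrative sketch only: a daily SQL transformation on BigQuery using the
# google-cloud-bigquery client. Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project id

# Append yesterday's aggregated revenue into a reporting table.
destination = bigquery.TableReference.from_string(
    "example-project.reporting.daily_sales")
job_config = bigquery.QueryJobConfig(
    destination=destination,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

query = """
    SELECT
      DATE(order_ts) AS order_date,
      store_id,
      SUM(total)     AS revenue
    FROM `example-project.raw.orders`
    WHERE DATE(order_ts) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    GROUP BY order_date, store_id
"""

query_job = client.query(query, job_config=job_config)
query_job.result()  # wait for the transformation to finish
print(f"Job {query_job.job_id} finished; results appended to {destination}")
```

    A job like this would typically be scheduled with Cloud Composer or BigQuery scheduled queries; the query itself is a placeholder for the actual transformation logic.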
