Data Engineer, Cross Border Science and Analytics

Amazon · Noida
Amazon's Cross Border business is looking for a passionate data engineer who can help build the AI-ready data infrastructure that supports all analytical and science products for its businesses.
  • 1+ years of data engineering experience
  • Experience with data modeling, warehousing, and building ETL pipelines
  • Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
  • Experience with one or more scripting languages (e.g., Python, KornShell)
  • Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
  • Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, DataStage, etc.
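The pipeline work these qualifications describe can be illustrated with a minimal, self-contained sketch. This is not Amazon's stack: the `run_etl` function, the `staging` and `orders` tables, and the sample order rows are all hypothetical, and an in-memory SQLite database stands in for a real warehouse.

```python
import sqlite3

def run_etl(raw_rows):
    """Minimal extract-transform-load sketch: stage raw order rows as text,
    validate and cast them, and load the clean rows into a reporting table."""
    conn = sqlite3.connect(":memory:")  # stand-in for a real warehouse

    # Extract: land the raw feed untouched in a staging table.
    conn.execute("CREATE TABLE staging (order_id TEXT, amount TEXT)")
    conn.executemany("INSERT INTO staging VALUES (?, ?)", raw_rows)

    # Transform + load: cast amounts to REAL, dropping rows whose
    # amount does not start with a digit (a crude validity check).
    conn.execute("CREATE TABLE orders (order_id TEXT PRIMARY KEY, amount REAL)")
    conn.execute("""
        INSERT INTO orders
        SELECT order_id, CAST(amount AS REAL)
        FROM staging
        WHERE amount GLOB '[0-9]*'
    """)
    conn.commit()
    return conn

conn = run_etl([("A1", "10.5"), ("A2", "oops"), ("A3", "7")])
count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
# The malformed row "A2" is filtered out during the transform step.
```

In a production setting the same extract/transform/load split would typically run on Spark or EMR against Hive tables, with the validation rules maintained as data-quality checks rather than inline SQL.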
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information.

If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

New offer

Data Engineer

Aqusag Technologies · Noida
Experience: 3+ years · Job type: Full-time (contractual / freelancing) · Location: Remote · Required skills: Python, MySQL, PostgreSQL, Java, JavaScript, AI. Job summary: We are seeking an expert Data Engineer to join our team and play...
Featured

Lead Data Engineer

Iris Software Inc. · Noida
services. Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation...
High salary

Data Engineer

Accenture · Gurgaon, 33 km from Noida
Project role: Data Engineer. Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load...