Zortech Solutions – Sr. Data Architect with GCP-Canada – Toronto, ON

Company: Zortech Solutions

Location: Toronto, ON

Job date: Wed, 09 Apr 2025 22:18:48 GMT

Job description:

Role: Sr. Data Architect with GCP
Location: Remote-Canada
Duration: 6-12+ months

Key Responsibilities:

Join our quest at the client as a Data Architect, where you'll be at the heart of innovation, shaping our trailblazing Generative AI platform. Our vision? To revolutionize the Gen-AI landscape, making it as universal and user-friendly as possible, touching lives from seasoned developers to creative designers and everyone in between.

Our mantra is simple yet profound: 'Focus, Flow, and Joy'. If you have a fervent interest in crafting innovative products for a broad audience and are excited about leveraging state-of-the-art technology, this is the right role for you.

Imagine being part of our Data Platform Engineering team, a place where the status quo is questioned, processes are perfected, and cutting-edge tech is used not just for its own sake but to fundamentally transform effort into efficiency and ideas into reality. This isn't just a job; it's a journey to redefine the future of technology.

What you'll do:

You will be part of a dynamic team of experienced professionals dedicated to delivering comprehensive solutions that harness the power of Data and Generative AI technology, including the development of custom-built products. By joining our community of passionate builders, you will contribute to our shared goal of providing the most valuable, user-friendly, and enjoyable experiences. You will play a key role in ensuring the quality and rapid delivery of products built on the client's Generative AI platform.

We enjoy:

- Exploring bleeding-edge technologies, tools, and frameworks to experiment with and build better products for existing customers
- Evaluating areas of improvement in the technical products we have built and implementing ideas that make us better than yesterday
- Collaborating with developers on technical designs and developing code, configurations, and scripts to enhance the development lifecycle and integrate systems
- Collaborating proactively and respectfully with our team and customers
- Developing tools and integrations to support other developers in building products
- Taking solutions from concept to production by writing code, configurations, and scripts
- Improving existing platforms and implementing new features for any of our products
- Creating comprehensive documentation for implemented solutions, including implementation details and usage instructions
- Promoting our culture of focus, flow, and joy to gain developers' support for our solutions

Qualifications:

What you bring:

- Build the data pipelines required for optimal extraction, anonymization, and transformation of data from a wide variety of data sources using SQL, NoSQL, and AWS 'big data' technologies, in both streaming and batch modes (a minimal pipeline sketch follows this list)
- Work with stakeholders, including Product Owners, Developers, and Data Scientists, to assist with data-related technical issues and support their data infrastructure needs
- Ensure that data is secure and separated in line with corporate compliance and data governance policies
- Take ownership of existing ETL scripts, maintaining them and rewriting them in modern data transformation tools whenever needed
- Be an automation advocate for data transformation, cleaning, and reporting tools
- You are proficient in developing software from idea to production
- You can write automated test suites in your preferred language
- You have frontend development experience with frameworks such as React.js/Angular
- You have backend development experience building and integrating with REST APIs and databases using languages such as Java Spring, JavaScript on Node.js, or Flask on Python
- You have experience with cloud-native technologies such as Cloud Composer, Dataflow, Dataproc, BigQuery, GKE, Cloud Run, Docker, Kubernetes, and Terraform
- You have used cloud platforms such as Google Cloud/AWS for application hosting
- You have used and understand CI/CD best practices with tools such as GitHub Actions and GCP Cloud Build
- You have experience with YAML and JSON for configuration
- You are up to date on the latest trends in AI technology
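To picture the pipeline work above concretely, here is a minimal sketch of a batch extract-anonymize step written as an Airflow DAG (Airflow is the engine behind Cloud Composer, named in the list above). The DAG id, sample rows, and SHA-256 hashing scheme are illustrative assumptions, not part of the role description, and the snippet assumes Airflow 2.4+.

```python
import hashlib

import pendulum
from airflow import DAG
from airflow.operators.python import PythonOperator


def anonymize_rows():
    """Hash direct identifiers so raw PII never reaches the warehouse."""
    # Stand-in for a real extract from a source system (hypothetical data).
    rows = [{"email": "jane@example.com", "amount": 42}]
    for row in rows:
        row["email"] = hashlib.sha256(row["email"].encode()).hexdigest()
    return rows


with DAG(
    dag_id="extract_anonymize_load",   # hypothetical name
    start_date=pendulum.datetime(2025, 4, 1, tz="UTC"),
    schedule="@daily",  # batch cadence; a streaming path would use Dataflow instead
    catchup=False,
) as dag:
    PythonOperator(task_id="anonymize", python_callable=anonymize_rows)
```

In a real deployment the extract and load would be their own tasks (for example, a BigQuery load after the hashing step), keeping the anonymization isolated and testable.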
Great-to-haves:

- 3+ years of experience as a data or software architect
- 3+ years of experience in SQL and Python
- 2+ years of experience with ELT/ETL platforms (Airflow, dbt, Apache Beam, PySpark, Airbyte)
- 2+ years of experience with BI reporting tools (Looker, Metabase, QuickSight, Power BI, Tableau)
- Extensive knowledge of the Google Cloud Platform, specifically Google Kubernetes Engine
- Experience with GCP cloud data services (Dataflow, GCS, Datastream, Data Fusion, Data Application, BigQuery, Dataproc, Dataplex, Pub/Sub, Cloud SQL, Bigtable); see the Beam sketch at the end of this posting
- Experience in the health industry is an asset
- Expertise in Python and Java
- Interest in PaLM, LLM usage, and LLMOps
- Familiarity with LangFuse, Backstage plugins, or GitHub Actions
- Strong experience with GitHub beyond source control
- Familiarity with monitoring, alerting, and logging solutions

Join us on this exciting journey to make Generative AI accessible to all and create a positive impact with technology.

#LI-CEIPAL
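As a concrete illustration of the Dataflow/Apache Beam/BigQuery stack named in the great-to-haves, here is a minimal Beam pipeline sketch. The project, dataset, table, and schema are hypothetical; the same code runs locally on the DirectRunner or on Dataflow when launched with --runner=DataflowRunner.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Picks up --runner, --project, etc. from the command line;
    # defaults to the local DirectRunner when none are given.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Create sample rows" >> beam.Create(
                [{"user": "u1", "score": 10}, {"user": "u2", "score": 7}]
            )
            | "Keep high scores" >> beam.Filter(lambda row: row["score"] >= 8)
            | "Write to BigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.high_scores",  # hypothetical table
                schema="user:STRING,score:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Swapping beam.Create for a Pub/Sub or Datastream source turns the same shape into the streaming variant the role also calls for.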
