We use cookies to provide the essential functionality of the website and its advanced features, to analyse traffic and to improve our services to you, and to provide personalised content to you.
By clicking ‘Accept All’ you agree to these. You can change your selection at any time under the ‘Cookie Settings’ link on the page.
You can read more about our cookies in our Cookie Policy.
These cookies are essential to the functionality of thebigjobsite.com.
When you log in to the Internet Site the Company will set a cookie containing a randomly generated unique reference number. This anonymous number allows the Company to identify you. The Company will never store your personal information directly in a cookie. A persistent cookie will also be set; persistent cookies are not deleted when you close your browser, and allow the Internet Site to recognise you on your next visit.
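As an illustration, the behaviour described above can be sketched with Python’s standard library. The cookie name `session_ref` and its exact lifetime are assumptions for the example, not the site’s actual values; the key point is the randomly generated, anonymous reference number and the persistent expiry.

```python
import secrets
from http import cookies

def issue_session_cookie() -> str:
    """Build a Set-Cookie header carrying a random, anonymous reference number."""
    # Randomly generated unique reference; contains no personal information.
    ref = secrets.token_hex(16)
    jar = cookies.SimpleCookie()
    jar["session_ref"] = ref  # hypothetical cookie name, for illustration only
    morsel = jar["session_ref"]
    morsel["max-age"] = 2 * 365 * 24 * 3600  # persistent: survives browser close
    morsel["secure"] = True
    morsel["httponly"] = True
    return morsel.OutputString()

header = issue_session_cookie()
print(header)
```

Because only a random token is stored, the server must keep the mapping from reference number to account on its own side; the cookie itself reveals nothing about the user.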
· ATTBCookie* (2 years): These cookies are used to remember a user’s choice about cookies on thebigjobsite.com. Where users have previously indicated a preference, that preference will be stored in these cookies.
· last-search (1 day), search (Session), redirect-stage (1 hour), original-keyword (1 hour): These cookies are used by thebigjobsite.com to pass search data between our own pages.
· datadome (1 year): DataDome is a cybersecurity solution used to detect bot activity.
· jjap (1 day): Used to track whether you have seen the Job Alerts prompt. Job Alerts is a service you can subscribe to for information about new jobs.
Advanced features of the site use cookies to provide information you requested and to reduce the need for you to re-enter the same fields repeatedly.
· attb-loc (3 months): Stores your location information so that we can pre-populate search fields to find jobs near you.
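A minimal sketch of how such a stored location might be read back to pre-populate a search field; only the `attb-loc` name comes from the table above, and the plain-string value format is an assumption for the example.

```python
from http.cookies import SimpleCookie

def prefill_location(cookie_header: str, default: str = "") -> str:
    """Return the stored location from the attb-loc cookie, if present."""
    jar = SimpleCookie()
    jar.load(cookie_header)  # parse a Cookie request header
    if "attb-loc" in jar:
        return jar["attb-loc"].value
    return default  # no stored location: leave the search field at its default

print(prefill_location("attb-loc=Montreal"))
```

If no cookie is present, the search form simply falls back to an empty (or default) location field rather than failing.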
Analytic cookies allow the Company to see how the Internet Site is being used. This information forms the basis of future development work, and so enables the Company to continually improve its Internet Site to best suit its users.
· __gads (13 months)
· _ga (2 years)
· _ga_JH3TWMTYRK (2 years)
· _gat_gtag_UA_1462011_9 (1 minute)
· _gcl_au (90 days)
· _gid (24 hours)
· _uetsid (1 day)
· _uetvid (16 days)
Google Analytics: For purposes of analytics, your UserID may be tracked and sent to Google Analytics after you register for one of our services such as Job Alerts. After you register, your registered session may be stitched together with your original, unauthenticated session. This allows longer-term tracking to help us monitor the effectiveness of our marketing campaigns. No personally identifiable information, or data that permanently identifies your device, is sent along with your tracking IDs.
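The session-stitching idea described above can be sketched as follows. The event structure and field names here are illustrative assumptions, not Google Analytics’ actual schema: an anonymous client ID accumulates events, and once a user ID appears (e.g. after registering for Job Alerts), earlier anonymous events from the same client can be linked to it.

```python
# Hypothetical event log: one anonymous event, then events after registration.
events = [
    {"client_id": "cid-123", "user_id": None, "action": "view_job"},
    {"client_id": "cid-123", "user_id": "u42", "action": "register_job_alerts"},
    {"client_id": "cid-123", "user_id": "u42", "action": "open_alert_email"},
]

def stitch(events):
    """Back-fill user_id onto earlier anonymous events from the same client."""
    # Map each client ID to the user ID seen on its authenticated events.
    id_map = {e["client_id"]: e["user_id"] for e in events if e["user_id"]}
    return [
        {**e, "user_id": e["user_id"] or id_map.get(e["client_id"])}
        for e in events
    ]

stitched = stitch(events)
```

After stitching, the pre-registration `view_job` event is attributed to the same user as the later authenticated events, which is what allows longer-term campaign measurement without storing personal data in the IDs themselves.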
Role Profile: Development under agile methodologies. Candidates must therefore be able to work collaboratively, demonstrate good ownership and be able to work well in teams. Work will include designing, enhancing, and developing MongoDB databases and Kafka Clusters, and may occasionally require working across different database environments. The job
Bounteous x Accolite makes the future faster for the world's most ambitious brands. Our services span Strategy, Analytics, Digital Engineering, Cloud, Data & AI, Experience Design, and Marketing. We are guided by Co-Innovation, our proven methodology of collaborative partnership. Bounteous x Accolite brings together 5000+ employees spanning North Am
Position: Data Engineer (MongoDB and Kafka). Location: Montreal, CA (office attendance from day 1; hybrid, 3 days per week on site). Duration: Long term. Role Profile: · Development under agile methodologies. · Candidates must therefore be able to work collaboratively, demonstrate good ownership and be able to work well in teams. Work w
Location: Montreal, Quebec
Our client works to identify and solve the most complex and highest value business problems that can be addressed through data science techniques. To achieve this, they provide data science, operations research and artificial intelligence solutions and software products to a broad range of industry and technology partners
DRW is a diversified trading firm with over 3 decades of experience bringing sophisticated technology and exceptional people together to operate in markets around the world. We value autonomy and the ability to quickly pivot to capture opportunities, so we operate using our own capital and trading at our own risk.
Headquartered in Chicago with of
Job Title: Data Engineer. Location: Montreal, CANADA. Duration: Full-time role. With a startup spirit and 115,000+ curious and courageous minds, we have the expertise to go deep with the world’s biggest brands—and we have fun doing it. We dream in digital, dare in reality, and reinvent the ways companies work to make an impact far bigger than just our bot
We're seeking a skilled Data Engineer with expertise in Google Cloud Platform (GCP) to join our DataOps team.
As our first ever Data Engineer, you'll be instrumental in designing, optimizing, and maintaining data pipelines on GCP. If you are looking for a role with a wide scope, in a small team, in which you actually get a say regarding the futu
Job Source: Confidential
Data Engineer
Montreal, QC, Canada
This position is for a data engineer role in the MongoDB and Kafka squad. We are a highly technical group delivering on multiple projects for multiple business areas in parallel. The business owners and subject matter experts are globally distributed, making strong communication skills important to the position. The candidate will be expected to work closely with our IT partners in analyzing and delivering on business requirements.
Role Profile:
· Development under agile methodologies.
· Candidates must therefore be able to work collaboratively, demonstrate good ownership, and work well in teams. Work will include designing, enhancing, and developing MongoDB databases and Kafka clusters, and may occasionally require working across different database environments.
· The job will involve considering all aspects of the project life cycle, including proof-of-concept evaluations, coding, designing, testing, implementing, deploying, and continued support of project releases, as well as on-call Level 2 support.
Major Role & Responsibilities:
· Infrastructure engineering for stability, scalability, and capacity planning for some of the largest configurations in the financial industry.
· DBA functions: plan and execute database schema changes, bottleneck analysis, outage prevention/resolution, server/database performance improvements, database maintenance, database recovery.
· Development of tools for Kafka and MongoDB.
· Enforce database security, standards, and guidelines.
Qualifications
Required Skills:
· 3-5 years of experience with Python.
· 3-5 years of experience with MongoDB and/or Kafka.
· Basic knowledge of other databases is a plus.
· Basic knowledge of agile and DevOps is a plus.
· Versioning tools such as git are a plus.
Personal Skills:
· Integrity and ownership; good team player.
· Ability to work under time and resource dependencies and constraints.
· Ability to find simple and effective solutions.
· High degree of motivation to expand technical and business knowledge.