
Leading supplier of end-to-end high speed Ethernet and InfiniBand intelligent interconnect solutions and services.

Founded: 1993
Headcount (approx.): 42,295
Core focus: GPUs, AI computing platforms, systems, and software
Notable software: CUDA, Omniverse
Focus areas: High-performance computing, AI infrastructure, graphics rendering, networking for data centers, and industrial/autonomous systems.
Industry: Semiconductors / AI compute / Software platforms
Recent investments:
* $2 billion: Investment in CoreWeave to expand AI compute capacity.
* $5 billion: Purchase of an equity stake in Intel as part of a collaboration.
* $500 million - $1 billion (reported): Reported investment in Poolside as part of a larger funding round.
NVIDIA is seeking a Senior Machine Learning Software Engineer to discover and develop new low-precision and sparsity recipes in the pretraining setting. We are a team committed to developing next-generation software that makes use of novel hardware features on current GPUs, and we also provide guidance for the design of next-generation GPU features. The job scope spans recipe design for all phases of the LLM life cycle: pre-training, post-training, and generation. Making these recipes generic and accurate is critical for adoption. Your work will be a component of our software productization story in libraries such as Megatron-LM, Transformer Engine, cuDNN, and cuBLAS.

* Keep abreast of quantized LLM training research
* Build robust and reproducible training recipes
* Collaborate closely with hardware, software, and research teams to assess and adopt deep learning algorithmic advancements in quantization
* Work with production software teams to realize recipes in production workflows

* PhD or M.S. degree (or equivalent experience) in Computer Science or a related field, and 5+ years of relevant software engineering experience
* Proficiency in Python
* Experience with PyTorch or a similar framework
* Solid foundation in LLM pre-training, post-training, or generation
* Proficiency in the mathematics of machine learning
* Strong written and oral communication skills

* Proficiency in precision and numerics for ML
* Familiarity with FP8 and MX formats for training
* Strong programming skills and the ability to debug ML systems

The GPU started out as the engine for simulating human imagination, conjuring up the amazing virtual worlds of video games and Hollywood films. Now, NVIDIA’s GPU runs deep learning algorithms, simulating human intelligence, and acts as the brain of computers, robots, and self-driving cars that can perceive and understand the world. Just as human imagination and intelligence are linked, computer graphics and artificial intelligence come together in our architecture.
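The posting's emphasis on FP8 and low-precision numerics refers to quantized training, in which tensors are rounded to a reduced-precision format and back during computation. The sketch below is a hypothetical illustration in pure Python, not NVIDIA's code: it shows the core quantize-dequantize ("fake quant") idea by rounding a float to a limited number of mantissa bits, and it deliberately ignores the exponent-range saturation that real FP8 and MX formats impose.

```python
import math

def fake_quant(x: float, mantissa_bits: int = 3) -> float:
    """Round x to the nearest value representable with `mantissa_bits`
    fractional mantissa bits (an idealized quantize-dequantize step;
    the name and signature are illustrative, not a real library API)."""
    if x == 0.0:
        return 0.0
    # Decompose x = m * 2**e with 0.5 <= |m| < 1.
    m, e = math.frexp(x)
    # Keep mantissa_bits bits after the leading bit, round to nearest.
    scale = 2 ** (mantissa_bits + 1)
    return round(m * scale) / scale * 2.0 ** e

# Example: 0.1 is not exactly representable with a 3-bit mantissa,
# so the round trip introduces a small quantization error.
print(fake_quant(0.1))   # a nearby representable value, e.g. 0.1015625
print(fake_quant(1.0))   # powers of two survive exactly: 1.0
```

Recipe work of the kind described in the posting then amounts to deciding where in the training loop such rounding is applied (weights, activations, gradients) and how scaling keeps values inside the format's dynamic range, so that accuracy matches a full-precision baseline.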
Today, NVIDIA GPUs are used broadly for deep learning, and NVIDIA is increasingly known as “the AI computing company.” Widely considered one of the technology world’s most desirable employers, NVIDIA has some of the most forward-thinking and hardworking people in the world inventing the future for us. Are you a creative and collaborative software engineer seeking new challenges? If so, we want to hear from you! Come join us and help build the real-time, cost-effective AI computing platform driving our success in this exciting and quickly growing field.

The base salary range is 184,000 USD - 287,500 USD. Your base salary will be determined based on your location, experience, and the pay of employees in similar positions. You will also be eligible for equity and benefits.

NVIDIA is committed to fostering a diverse work environment and is proud to be an equal opportunity employer. As we highly value diversity in our current and future employees, we do not discriminate (including in our hiring and promotion practices) on the basis of race, religion, color, national origin, gender, gender expression, sexual orientation, age, marital status, veteran status, disability status, or any other characteristic protected by law.

JR1999923