launcher-config
Configure NeMo AutoModel job launches for interactive runs, Slurm clusters, and SkyPilot cloud execution.
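As a rough illustration of the kind of launcher configuration described above, the sketch below renders Slurm `#SBATCH` directives from a small config dict. The field names (`job_name`, `nodes`, `gpus_per_node`, `time_limit`) are illustrative assumptions, not the actual NeMo AutoModel launcher schema.

```python
# Hypothetical sketch: turning a simple launcher config into an sbatch
# header. Field names are illustrative, not NeMo AutoModel's real schema.

def render_sbatch_header(cfg: dict) -> str:
    """Render Slurm #SBATCH directives from a minimal launcher config."""
    lines = [
        "#!/bin/bash",
        f"#SBATCH --job-name={cfg['job_name']}",
        f"#SBATCH --nodes={cfg['nodes']}",
        f"#SBATCH --gpus-per-node={cfg['gpus_per_node']}",
        f"#SBATCH --time={cfg['time_limit']}",
    ]
    return "\n".join(lines)

header = render_sbatch_header({
    "job_name": "automodel-finetune",
    "nodes": 2,
    "gpus_per_node": 8,
    "time_limit": "04:00:00",
})
print(header)
```

An interactive run would skip the header entirely and invoke the training entrypoint directly; the Slurm path only differs in how the job is submitted.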
Guide for selecting and configuring distributed training strategies in NeMo AutoModel, including FSDP2, Megatron FSDP, DDP, and parallelism settings.
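Strategy selection like that described above can be sketched as a small dispatch: honor an explicit request if given, otherwise apply a heuristic. The strategy names mirror those listed (FSDP2, Megatron FSDP, DDP), but the dispatch logic and the parameter-count threshold are illustrative assumptions, not NeMo AutoModel's actual implementation.

```python
# Hypothetical strategy-selection sketch. Strategy names come from the
# description above; the heuristic and threshold are illustrative only.
from typing import Optional

STRATEGIES = {"fsdp2", "megatron_fsdp", "ddp"}

def choose_strategy(world_size: int, model_params: int,
                    requested: Optional[str] = None) -> str:
    """Pick a distributed strategy: honor an explicit request, otherwise
    shard large models (FSDP2) and replicate small ones (DDP)."""
    if requested is not None:
        if requested not in STRATEGIES:
            raise ValueError(f"unknown strategy: {requested}")
        return requested
    if world_size == 1:
        return "ddp"  # single process: nothing to shard
    # Shard parameters when the model is too large to replicate per rank.
    return "fsdp2" if model_params > 1_000_000_000 else "ddp"

print(choose_strategy(8, 7_000_000_000))  # large model on 8 ranks
```

The real choice also depends on parallelism settings (tensor/pipeline degrees, activation checkpointing), which a one-dimensional heuristic like this does not capture.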
Install the "distributed-training" skill with:

npx skills add NVIDIA-NeMo/skillsmp-nvidia-nemo-nvidia-nemo-distributed-training
Related skills (matched by shared tags or category signals):
Guide for onboarding new model families into NeMo AutoModel, including architecture discovery, implementation patterns, registration, and validation.
Verify numerical parity between NeMo AutoModel implementations and reference HuggingFace models, including state dict and forward-pass checks.
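A parity check of the kind described here can be sketched as a keyed, elementwise tolerance comparison. This is a minimal stand-in using plain float lists and `math.isclose`; a real check would compare tensors (e.g. with `torch.allclose`), and the key names and tolerances below are illustrative assumptions.

```python
# Hypothetical state-dict parity sketch: compares two dicts of plain
# float lists within a tolerance. Key names and tolerances are illustrative.
import math

def state_dicts_match(ref: dict, impl: dict,
                      rtol: float = 1e-5, atol: float = 1e-8) -> bool:
    """True iff both dicts have identical keys and elementwise-close values."""
    if ref.keys() != impl.keys():
        return False
    for key in ref:
        a, b = ref[key], impl[key]
        if len(a) != len(b):
            return False
        if not all(math.isclose(x, y, rel_tol=rtol, abs_tol=atol)
                   for x, y in zip(a, b)):
            return False
    return True

ref = {"layer.weight": [0.5, -1.25], "layer.bias": [0.0]}
impl = {"layer.weight": [0.5, -1.25], "layer.bias": [0.0]}
print(state_dicts_match(ref, impl))  # True
```

The forward-pass half of the check follows the same pattern: run both models on identical inputs and compare outputs within a tolerance, rather than comparing parameters.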
Environment setup and day-to-day development workflow for NeMo AutoModel, including installation, tooling, and code quality commands.