CVs for AI roles: how to present open-source contributions versus proprietary work
CV tips · AI · portfolio


joblondon
2026-01-30 12:00:00
9 min read

Practical steps to display open-source work and redacted proprietary AI projects on your CV, with templates and 2026 hiring trends.

Stuck between public GitHub stars and a strict NDA?

If you build AI models, you face a common London job-market dilemma in 2026: recruiters expect visible proof of technical impact, but much of your best work sits behind corporate walls. Whether you're a student trying to stand out, a mid-career machine learning engineer, or an academic moving into industry, this guide gives clear, practical steps to present open-source contributions and proprietary work on your CV and portfolio — while respecting confidentiality and making your technical impact obvious.

What changed in 2025–26 hiring

Hiring signals shifted dramatically across late 2025 and into 2026. Key trends to know:

  • Open-source prominence: More companies value OSS experience — not just code commits but model releases, reproducibility artifacts, and community leadership.
  • Regulatory scrutiny: The enforcement phase of the EU AI Act and updated UK ICO guidance (2025–26) make provenance, model cards and data lineage more valuable in interviews.
  • Automated vetting: Recruiters increasingly use automated GitHub analyses and code-scoring tools; high-quality READMEs, tests and notebooks now help more than raw commit count.
  • Proven impact over raw code: Employers want clear business or scientific outcomes (latency, accuracy lift, cost reduction), even if they can't see the code.

These trends mean you should treat your CV and portfolio as both a technical record and a governance-friendly dossier.

Principles to follow: how to balance visibility with confidentiality

  • Be specific, not revealing: Quantify outcomes and describe architectures at a high level without disclosing IP.
  • Prefer artifacts you own: Link to public repos, model cards, demo spaces, or anonymised notebooks you control.
  • Use governed disclosure: When discussing proprietary work, use approved language and, when possible, get written permission for redacted examples.
  • Prove reproducibility: For open-source work, include scripts, CI badges, and sample data pipelines so recruiters and engineers can validate your claims fast.

Showcasing open-source contributions — what to include on CV and portfolio

Open-source contributions are the clearest evidence of technical skill. But recruiters judge quality, not quantity. Use this checklist to make OSS work shine.

CV bullets: the formula

Use: Situation + action + result + proof link. Keep each line concise and metric-led.

  • Example: "Led implementation of distillation pipeline for transformer model in project-name, reducing inference latency by 35% and cutting costs by £12k/month — code & demo: github.com/you/project-name."
  • Example: "Maintainer & release manager for lib-vision (v2.0): introduced robust tests and CI leading to 3x contributor growth; 400+ weekly downloads."

Portfolio elements that attract recruiters and convert

  • Repository landing page: Clear README, one-line value proposition, quickstart, license, and model card if relevant.
  • Reproducible demo: Hugging Face Space, Colab notebook, or Docker image. Recruiters love a one-click demo.
  • Model cards and datasheets: Short document with scope, limitations, and intended uses — this signals governance awareness; include a short provenance note.
  • Tests & CI: Passing unit tests and CI badges prove engineering discipline; surface any automation from your training and CI pipelines.
  • Contribution highlights: Short section listing important PRs, issues you resolved, and community leadership (e.g., triaging, releases).
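The model-card bullet above can be satisfied with a dozen lines. A minimal sketch follows — the project name, data source, and figures are placeholders, not a prescribed format:

```markdown
# Model card: example-classifier (placeholder name)

## Intended use
Short-text topic classification for English product reviews; not for moderation decisions.

## Training data
Public reviews corpus (link); no PII retained. See provenance note below.

## Metrics
Macro-F1 0.81 on a held-out split (figure illustrative).

## Limitations
Degrades on texts under 10 tokens; not evaluated on non-English input.

## Provenance
Data sources, preprocessing scripts, and training config are versioned in this repo.
```

Even this short a card signals governance awareness; expand sections only where you have real evidence to cite.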

Showcasing proprietary work without breaching confidentiality

Proprietary work is often the most technically sophisticated, but it’s also sensitive. If you’re under an NDA or company policy, follow this five-step approach.

1. Use impact-first bullets

Start with metrics and outcomes, then list high-level methods. Avoid naming internal datasets, model names or exact architectures if forbidden.

  • Good: "Improved model F1 from 0.68 to 0.82 on a large-scale entity extraction task, enabling 40% faster downstream processing (proprietary data; code withheld)."
  • Poor: "Worked on internal dataset 'ProjectPhoenix' and released code for model X" (reveals project names and implies release).

2. Provide anonymised or redacted artifacts

Ask your employer for permission to publish redacted examples. If allowed, provide:

  • Sanitised notebooks with synthetic data that mirror the pipeline.
  • Architecture diagrams that show component interactions without proprietary labels.
  • Benchmark tables with anonymised dataset names (e.g., "internal e-comm dataset").
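A sanitised notebook usually boils down to: generate synthetic data with the same *shape* as the real input, run a stand-in for each pipeline stage, and report the same metrics. A minimal sketch, in which every name, transform, and threshold is invented for illustration:

```python
# Minimal sketch of a sanitised pipeline reproduction using synthetic data.
# All names, transforms, and numbers here are illustrative placeholders,
# not details from any real system.
import random

random.seed(0)

def make_synthetic_records(n=1000):
    """Generate fake records mirroring the *shape* of the real pipeline's input."""
    records = []
    for _ in range(n):
        amount = random.expovariate(1 / 50.0)           # stand-in numeric feature
        is_anomalous = amount > 150 or random.random() < 0.02
        records.append({"amount": amount, "label": int(is_anomalous)})
    return records

def featurise(record):
    """High-level stand-in for the proprietary feature-engineering step."""
    return {"root_amount": record["amount"] ** 0.5}     # placeholder transform

def predict(features, threshold=11.0):
    """Toy rule standing in for the real model."""
    return int(features["root_amount"] > threshold)

records = make_synthetic_records()
preds = [predict(featurise(r)) for r in records]
labels = [r["label"] for r in records]

# Report the same metrics the real pipeline reports, on synthetic data.
tp = sum(p and l for p, l in zip(preds, labels))
fp = sum(p and not l for p, l in zip(preds, labels))
fn = sum(l and not p for p, l in zip(preds, labels))
precision = tp / (tp + fp) if (tp + fp) else 0.0
recall = tp / (tp + fn) if (tp + fn) else 0.0
print(f"precision={precision:.2f} recall={recall:.2f}")
```

The point is not the toy model but the structure: an interviewer can step through each stage and see you understand the pipeline end to end, with nothing confidential on screen.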

3. Use non-disclosure-friendly phrasing

When you can’t provide artifacts, use structured phrasing that conveys expertise without details.

  • Template: "Built and productionised a [type of model] for [domain] — achieved [metric improvement], reduced cost/latency by [X]% (code and data proprietary)."
  • Template with redaction: "Developed a privacy-preserving feature engineering pipeline (details redacted) that improved model robustness under domain shift."

4. Get written permission where possible

Simple written permission from governance or your manager makes a huge difference. Provide a one-page, governance-playbook-style summary they can approve — it speeds hiring checks and avoids future disputes.

5. Show governance competency

Employers increasingly ask about model governance. Mention your role in:

  • Model risk assessments and documentation (model cards, datasheets).
  • Data lineage, privacy reviews, and GDPR/EU AI Act compliance work.
  • Monitoring, rollback, and incident-response processes for deployed models.

What to never publish: a confidentiality quicklist

  • Raw datasets containing PII, financials, or healthcare records.
  • Company-owned model weights or internal API keys.
  • Internal system logs, trace IDs, or infrastructure details that expose your security posture.
  • Unapproved screenshots of internal dashboards or product roadmaps.

Practical CV snippets: before & after

Show — don’t just tell. Below are short CV lines for open-source and proprietary work, followed by improved versions that recruiters prefer.

Open-source example

Before (weak): "Contributed to open-source ML projects on GitHub."

After (strong): "Implemented low-precision quantisation in open-quant — reduced model size by 60% and inference latency by 25%; accepted PR #142; demo: github.com/you/open-quant (maintainer endorsement)."

Proprietary example

Before (weak): "Worked on recommender system at LargeCo."

After (strong): "Led model optimisation for LargeCo’s B2C recommender, increasing click-through by 18% and lowering inference cost by 28% using A/B-tested candidate ranking (proprietary system; redacted demo available on request)."

Portfolio layout: a practical template

Use a single-page portfolio or README that follows this order so recruiters see proof quickly:

  1. Hero summary: One sentence covering your role, key strengths, and contact links.
  2. Top 3 projects: For each project show a one-line impact, one screenshot or badge, and 2–3 bullet technical highlights. Link to code/demo if public.
  3. Redacted proprietary work: One redacted case study with metrics, high-level architecture diagram, and contact for verification (manager or legal if allowed).
  4. Governance & reproducibility: Model cards, tests, CI, and an outline of your SDLC for ML (data lineage, monitoring, rollback policy).
  5. Skills & toolbelt: Short list: PyTorch/TensorFlow, Hugging Face, MLflow, W&B, Docker, Kubernetes, Python, SQL, etc.
  6. Contact & verification: Link to LinkedIn, GitHub, and a short note about reference checks and permissioned artefacts.

Interview-ready artefacts and talking points

Interviews are where redacted work becomes convincing. Prepare these items:

  • Synthetic mini-reproduction: Notebook that mimics the pipeline using public or synthetic data. Walk interviewers through it step by step.
  • Performance charts: Pre/post graphs (accuracy, latency, cost) with anonymised axes and clear captions.
  • Architecture sketch: One-slide architecture with component names replaced by functions (e.g., "Feature store", "Ranking model").
  • Governance narrative: One paragraph on the GDPR and EU AI Act considerations you addressed, and the automated tests you added to guard against regressions.
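For the performance charts, the underlying deltas are easy to compute and sanity-check from anonymised before/after numbers. A minimal sketch — every figure below is invented:

```python
# Compute signed pre/post deltas for an anonymised performance chart.
# All metric names and values are invented for illustration.
before = {"f1": 0.68, "p95_latency_ms": 420.0, "cost_per_1k_req_gbp": 0.90}
after  = {"f1": 0.82, "p95_latency_ms": 310.0, "cost_per_1k_req_gbp": 0.65}

def delta_pct(metric):
    """Signed percentage change from before to after for one metric."""
    return 100.0 * (after[metric] - before[metric]) / before[metric]

for metric in before:
    print(f"{metric}: {before[metric]} -> {after[metric]} ({delta_pct(metric):+.1f}%)")
```

Positive deltas are good for quality metrics and bad for latency and cost, so caption each axis explicitly; interviewers will probe any number you present.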

Case studies: two short, realistic scenarios

Case 1 — Alice, open-source contributor

Alice is a PhD student in London who maintains an open-source dataset cleaning library. She:

  • Includes a model card and benchmarks on her GitHub repo, with a simple Colab demo.
  • Highlights 2 accepted PRs where she fixed critical bugs and notes the number of downstream projects using the library.
  • On her CV, she lists: "Maintainer, cleanlib — improved dataset throughput 4x, enabling faster model training across 25 projects; demo: link".

Result: recruiters now contact her directly for entry-level ML engineer roles and internships.

Case 2 — Ben, proprietary ML engineer

Ben works at a fintech with an NDA. He:

  • Obtains written permission to publish a redacted architecture and a synthetic notebook that reproduces the core pipeline with public data.
  • On his CV: "Reduced fraud detection false positives by 32% while maintaining recall via feature engineering and ensemble stacking (production system; redacted assets available on request)."
  • During interviews he shows a synthetic notebook and a one-page risk assessment he authored for model governance.

Result: Ben receives offers where hiring teams validate his impact through technical conversations rather than code review alone.

Legal and ethical guardrails

Always prioritise legal compliance and ethics. Key reminders:

  • Follow your employment contract and NDA. When in doubt, ask legal or your manager for written permission.
  • Never fabricate or exaggerate results. Recruiters sometimes request proof during interviews or references; be prepared to show reproducible demos or redacted reproductions built with synthetic data.
  • When publishing open-source derivatives of proprietary systems, ensure no licensed code or data is included.
Tip: A short email to your manager with 3 bullet examples of what you'd like to publish often gets quick approval — most companies prefer controlled disclosure to surprises.

Checklist: what to do this week to improve your AI CV

  1. Audit your CV: turn one generic line into a metric-led impact statement.
  2. Pick one open-source project and add a short demo (Colab or Hugging Face Space).
  3. Prepare one synthetic notebook that replicates a proprietary pipeline step.
  4. Draft a one-page redacted case study and seek manager approval.
  5. Add a model card or short governance note to your portfolio for every model you mention.

Advanced strategies for senior candidates (2026)

If you're at hiring-manager level or in a senior staff role, emphasise leadership in open-source governance, reproducibility standards, and deployment reliability:

  • Show adoption metrics: companies/teams using your library or internal playbook.
  • Publish a governance playbook or template you used to pass audits, with proprietary examples redacted.
  • List cross-functional outcomes: reduced incident frequency, improved MTTR, or compliance milestones reached.

Final practical templates you can paste into your CV

Open-source bullet:

"Led development of project-name, adding quantisation and pruning pipelines that reduced inference cost by 45% and are used by 15 downstream projects — code & demo: [link]."

Proprietary bullet (NDA-friendly):

"Designed and productionised ranking model for B2C platform; improved conversion rate by 12% and reduced serving latency by 30% (system and code proprietary; redacted reproduction available on request)."

Wrap-up: present impact, protect IP, get hired

In 2026, the best AI CVs combine visible open-source proof with professionally packaged descriptions of proprietary impact. Recruiters want measurable outcomes, reproducibility signals, and governance awareness. Use the templates, checklists and case studies above to make your technical résumé persuasive without breaching confidentiality.

Next steps: Pick one open-source project to demo, and draft a one-page redacted case study for a proprietary project. Share both in your portfolio and prepare the synthetic notebook before your next interview.

Call to action

Need a quick CV review that protects your IP and highlights impact? Send your CV and two project links to our careers team for a free 10‑minute audit tailored to London employers and the 2026 AI hiring market. Click the contact button on this page to book your slot.


Related Topics

#CVtips #AI #portfolio

joblondon

Contributor

Senior editor and content strategist. Writing about technology, design, and the future of digital media. Follow along for deep dives into the industry's moving parts.
