A development team integrates a generative AI coding assistant that was trained on public repositories. The tool accelerates feature delivery but occasionally references deprecated libraries. The legal team warns that AI-generated code may contain license violations or reproduce proprietary logic if the model was trained on leaked internal code.
What should the security manager do FIRST?
A. Engage legal counsel to review the AI vendor's training data sources and contractual indemnification clauses
B. Implement software composition analysis (SCA) and require all AI-generated code to be digitally signed before commit
C. Restrict the AI tool's access to internal repositories and enforce output review through secure-coding peer validation
D. Retrain or fine-tune the AI model using only vetted, license-compliant code from approved sources
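For context on the controls named in option B only (this is not part of the question itself), a minimal pre-merge gate combining SCA and commit-signature checks might look like the sketch below. The `sca-scan` command is a hypothetical placeholder for whatever SCA tool the organization licenses; `git verify-commit` is the standard Git signature check.

```python
"""Illustrative pre-merge gate: verify the commit signature, then run an
SCA scan. Both commands are invoked via subprocess; the SCA CLI name is
a placeholder, not a real tool."""

import subprocess
import sys


def commit_is_signed(ref: str = "HEAD") -> bool:
    """Return True if the commit carries a valid signature (git exits 0)."""
    result = subprocess.run(["git", "verify-commit", ref], capture_output=True, text=True)
    return result.returncode == 0


def sca_scan_passes(path: str = ".") -> bool:
    """Placeholder SCA invocation (license and known-vulnerability check).

    'sca-scan' is a hypothetical CLI used only for illustration; substitute
    the organization's actual SCA tool and its real flags here.
    """
    result = subprocess.run(["sca-scan", path])
    return result.returncode == 0


if __name__ == "__main__":
    if not commit_is_signed():
        sys.exit("Blocked: commit is not signed with a verified key.")
    if not sca_scan_passes():
        sys.exit("Blocked: SCA scan reported license or dependency issues.")
    print("Gate passed: commit signed and SCA scan clean.")
```

Such a gate addresses the symptoms (unvetted dependencies, unattributed code) but not the contractual and training-data questions raised in the stem, which is why the options also include legal review and access restriction.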