Description
In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via the Python exec method.
Published: 2023-04-05
Score: 9.8 Critical
EPSS: 3.8% Low
KEV: No
Impact: n/a
Action: n/a
AI Analysis

Analysis and contextual insights are available on OpenCVE Cloud.

Remediation

No vendor fix or workaround currently provided.

Additional remediation guidance may be available on OpenCVE Cloud.

Tracking

Sign in to view the affected projects.

Advisories
Source ID Title
EUVD EUVD EUVD-2023-0118 In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via the Python exec method.
Github GHSA Github GHSA GHSA-fprp-p869-w6q2 LangChain vulnerable to code injection
History

Wed, 12 Feb 2025 17:15:00 +0000

Type Values Removed Values Added
Metrics ssvc

{'options': {'Automatable': 'yes', 'Exploitation': 'none', 'Technical Impact': 'total'}, 'version': '2.0.3'}


Subscriptions

Langchain Langchain
cve-icon MITRE

Status: PUBLISHED

Assigner: mitre

Published:

Updated: 2025-02-12T16:24:39.291Z

Reserved: 2023-04-05T00:00:00.000Z

Link: CVE-2023-29374

cve-icon Vulnrichment

Updated: 2024-08-02T14:07:45.736Z

cve-icon NVD

Status : Modified

Published: 2023-04-05T02:15:37.340

Modified: 2025-02-12T17:15:18.260

Link: CVE-2023-29374

cve-icon Redhat

No data.

cve-icon OpenCVE Enrichment

No data.

Weaknesses