By the classical definition, an impurity is an unwanted chemical substance that differs from the chemical composition of the material in question. In pharmaceutical preparations, the identification, assessment and quantification of impurities play a vital role in evaluating the quality of the drug substance or drug product being manufactured. To understand the concept of impurities in pharmaceutical preparations, we will use the tried and tested 5W-and-1H approach.

 

What are impurities in Pharmaceutical Preparations?

An impurity in a ‘drug substance/API’ may be defined as an unwanted chemical substance that differs from the chemical composition of the drug substance. It may differ in chemical identity, structure or stereochemistry (e.g., an enantiomer).

An impurity in a ‘drug product/finished formulation’ may be defined as any component of the drug product that is not the chemical entity defined as the drug substance or an excipient in the drug product.

 

What are the various sources of impurities?

Impurities in a drug substance (i.e., the active pharmaceutical ingredient, API) or drug product can arise from synthetic/manufacturing processes, degradation, storage conditions, the container, excipients, or contamination. They can be identified or unidentified, volatile or non-volatile, organic or inorganic species.

 

What are the various sources of impurities in Drug Substances?

The various sources of impurities in a drug substance are described below:

Organic Impurities: Organic impurities may be drug-related or process-related and consist of identified specified impurities, unidentified specified impurities and total unknown impurities.

As explained above, organic impurities are either Process Related Impurities (PRIs) or Degradation Related Impurities (DRIs). DRIs are generated by degradation of the API itself under specific storage conditions, e.g., by oxidation, dehydration or loss of carbon dioxide. Interactions between the API and excipients, the container, or residual impurities in excipients, reagents or solvents also lead to the generation of DRIs.

PRIs are found in every API unless proper care is taken at every step of the synthesis process. Although the end products are always washed with solvents, the possibility that unreacted residual materials are not washed away completely cannot be ruled out.

Solvents: Residual solvents are volatile organic chemicals that are used in, or generated during, the manufacturing process. Solvents known to cause toxicity should be avoided in the production of bulk drugs.

Inorganic Impurities: Inorganic impurities are usually derived from the manufacturing processes used for bulk drugs. Impurities associated with input raw materials and storage conditions can also contribute to the impurity profile of the drug substance. They are normally known and identified, and include the following:

Reagents, ligands, and catalysts

The chances of these impurities occurring are low; however, in some processes they can create a problem unless the manufacturer takes proper care during production.

Heavy metals

The main sources of heavy metals are the water used in the processes and the reactors (if stainless steel reactors are used) in which acidification or acid hydrolysis takes place. Heavy-metal impurities can largely be avoided by using demineralized water and glass-lined reactors.

Other materials (e.g., filter aids, charcoal etc.)

Filters and filtering aids such as centrifuge bags are routinely used in bulk drug manufacturing plants, and in many cases activated carbon is also used. Regular monitoring of fibres and black particles in the bulk drug is essential to avoid this type of contamination.

 

What are the various sources of impurities in Drug Products?

Impurities in drug products may arise from degradation of the drug substance itself, from the excipients used, or may be generated during processing, formulation or storage.

 

Why is it important to test impurities?

Impurities, if present beyond the acceptance criteria, may affect the identity, quality, safety and efficacy of the drug substance or drug product. Impurities may also accelerate degradation of the product itself, raising serious concerns about the stability of the product. Hence, proper qualitative and quantitative analysis is a must.

 

How are impurities classified?

Identified impurity: an impurity for which structural characterisation has been achieved.

Unidentified impurity: an impurity for which a structural characterisation has not been achieved and that is defined solely by qualitative analytical properties (for example, relative retention).

Specified impurity: an impurity that is individually listed and limited with a specific acceptance criterion in a monograph. A specified impurity can be either identified or unidentified.

Unspecified impurity: an impurity that is limited by a general acceptance criterion and not individually listed with its own specific acceptance criterion.

Other detectable impurities: potential impurities with a defined structure that are known to be detected by the tests in a monograph but not known to be normally present above the identification threshold in substances used in medicinal products that have been authorised by the competent authorities of Parties to the Convention. They are unspecified impurities and are thus limited by a general acceptance criterion.

 

What is the acceptance criterion for impurities?

The acceptance criterion is the benchmark above which the presence of impurities in a drug substance or drug product raises serious questions about the identity, quality, safety and efficacy of the product.

 

How to establish acceptance criteria for impurities?

Acceptance criteria should be set taking into account the qualification (the acquisition and evaluation of data establishing the safety of an impurity) of the degradation products, accelerated and long-term stability data, the expected expiry period and the recommended storage conditions for the drug product.

Allowance should be made for the normal variations in manufacturing, analysis and the stability profile.

Irrespective of the nature of these impurities, limits and acceptance criteria have to be worked out on the basis of factors such as toxicity, process capability, manufacturing practices and so on. The test for Related substances given in many monographs covers manufacturing impurities (intermediates and by-products) and/or degradation products. Specific tests may be supplemented by a more general test controlling other impurities.

Solvents are inorganic or organic liquids used as vehicles for the preparation of solutions or suspensions during the synthesis of a drug substance. Since these are generally of known toxicity, they can be controlled with appropriate limits. In addition to a general limit on solvents remaining behind in the final drug substances, some drugs need specific limits for specific solvents where variation in levels requires control.

In general, drug products have a test for impurities adapted from that in the monograph for the active ingredient with necessary modifications for including degradation products.

 

How should impurities be categorized while framing specifications?

Impurities shall be categorized as follows:

For Drug Substance:

  - each identified specified impurity
  - each unidentified impurity
  - total impurities
  - residual solvents
  - inorganic impurities

For Drug Product:

  - each specified degradation product
  - any unspecified degradation product
  - total degradation products

What should be the acceptance criterion for impurities?

In case data on the qualification and quantification of impurities are not available, or the product monograph is not official in any pharmacopoeia, the workable criteria as per IP 2018 could be:

For Drug Substance:

  - each identified specified impurity: NMT 0.5%
  - each unidentified impurity: NMT 0.3%
  - total impurities: NMT 1.0%

For Drug Product:

  - each specified degradation product: NMT 1.0%
  - any unspecified degradation product: NMT 0.5%
  - total degradation products: NMT 2.0%

Higher limits may be set if scientifically justified and it has been determined that the impurities are not toxic. In any case, the specifications should in course of time be refined to include tighter and more specific limits in the light of experience with production batches and a better understanding of the manufacturing process.
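
To make the workable limits above concrete, here is a minimal Python sketch (an illustration only, not part of IP 2018 or any compendial procedure; the function and dictionary names are invented for this example) that screens reported batch results against the tabulated NMT limits.

```python
# Minimal sketch (illustrative assumption): screening reported impurity
# results against the IP 2018 "workable" limits tabulated above.

DRUG_SUBSTANCE_LIMITS = {          # % w/w, NMT = not more than
    "each identified specified impurity": 0.5,
    "each unidentified impurity": 0.3,
    "total impurities": 1.0,
}

DRUG_PRODUCT_LIMITS = {
    "each specified degradation product": 1.0,
    "any unspecified degradation product": 0.5,
    "total degradation products": 2.0,
}

def check_results(results: dict, limits: dict) -> list:
    """Return (category, found %, limit %) for every exceedance."""
    failures = []
    for category, found in results.items():
        limit = limits.get(category)
        if limit is not None and found > limit:
            failures.append((category, found, limit))
    return failures

# Example: hypothetical drug-substance batch results (% w/w)
batch = {
    "each identified specified impurity": 0.42,
    "each unidentified impurity": 0.35,   # exceeds NMT 0.3%
    "total impurities": 0.95,
}
for category, found, limit in check_results(batch, DRUG_SUBSTANCE_LIMITS):
    print(f"{category}: found {found}% exceeds NMT {limit}%")
```

In practice such checks live in the laboratory's specification templates or LIMS; the point here is only the arithmetic of comparing each reported category against its limit.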

 

When should we test drug substances and drug products for the presence of impurities?

Drug substances and drug products shall be tested during and after manufacturing. Impurity testing shall also be conducted during stability studies to understand the nature and extent of the degradation products being generated. However, based on validation results, in-process testing of impurities may be eliminated.

 

How to perform testing of impurities?

The tests mentioned in a pharmacopoeial monograph are purity tests that provide information on the extent of known potential or actual impurities, rather than guaranteeing freedom from all possible impurities.

Chemical tests that reveal the levels of particular impurities or classes of impurities are often augmented by physical tests such as specific optical rotation, light absorbance, refractive index, etc. In addition, non-specific tests such as sulphated ash, heavy metals and loss on drying contribute to the assurance of the general quality of the article, the use of GMP in its production, the avoidance of contamination and the removal of volatile solvents.

Various test methods to detect impurities are:

  1. Chromatography: This is the basis of the test for Related substances. The test may be specific or general. A specific test is one where a particular impurity arising from the manufacturing process or from degradation needs to be limited on grounds of toxicity or any other special reason. Where the impurity is known to be particularly toxic, this should be taken into account. Such specific tests include a chromatographic or colorimetric comparison with a sample of the named substance, e.g., salicylic acid in aspirin. Both types of tests require the use of reference standards. In chromatographic determinations, in the absence of a reference standard, it is usual practice to limit the levels of impurities by the simple test of comparing the unknown spot or peak with a spot or peak obtained with a dilute solution of the substance under examination.
  2. Thin-layer chromatography (TLC): It is quick and is particularly useful in process monitoring and in detecting impurities during the course of manufacture. However, it has its limitations in fixing limits for specific impurities in the final product, although for a long time it was the most widely used technique for this purpose. Total impurities can be determined by gas chromatographic and liquid chromatographic tests, where the total impurity level is obtained by summation of the peak areas (usually in the range 1 to 2 per cent). This procedure is rarely adopted in TLC tests because of the semi-quantitative nature of estimating individual spots and the imprecise nature of expressing results for total impurities. This drawback can be overcome to an extent by the use of two- and three-level tests. In the former, in addition to a nominal concentration of the reference solution, another at a lower concentration is used for spotting the plate; in the latter, two more solutions at different lower concentrations are used.
  3. HPLC: In liquid chromatographic tests the relative detector response factor, which expresses the sensitivity of the detector to a substance relative to a standard substance, is an important factor to consider. As a general rule of thumb, if the response factor of an impurity is between 0.8 and 1.2, the impurity may be considered to have a response similar to that of the drug substance. Response factors less than 0.2 or more than 5 are not recommended; in such cases, the method needs to be amended to bring the response factor within the acceptable range, by choosing either a different wavelength of measurement or a different method of visualisation. Unknown impurities may be limited by reference to a dilution of the substance under examination used as a reference solution.
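
As a worked illustration of the response-factor idea above, the following sketch (hypothetical peak areas and helper names, not a compendial calculation) divides each impurity peak area by its relative response factor (RRF) before expressing it as area per cent of the corrected total.

```python
# Minimal sketch (illustrative only): response-factor-corrected area per cent.
# RRF = detector response of the impurity relative to the drug substance; an
# impurity peak area is divided by its RRF before computing area per cent.

def corrected_area_percent(api_area: float, impurity_peaks: dict, rrf: dict) -> dict:
    """impurity_peaks: name -> raw peak area; rrf: name -> relative response factor."""
    corrected = {name: area / rrf.get(name, 1.0)   # assume RRF 1.0 if unknown
                 for name, area in impurity_peaks.items()}
    total_area = api_area + sum(corrected.values())
    return {name: 100.0 * area / total_area for name, area in corrected.items()}

# Hypothetical chromatogram: main peak plus two impurity peaks
result = corrected_area_percent(
    api_area=98500.0,
    impurity_peaks={"impurity A": 450.0, "unknown RRT 1.23": 120.0},
    rrf={"impurity A": 0.9},        # within the 0.8-1.2 "similar response" range
)
for name, pct in result.items():
    print(f"{name}: {pct:.2f}%")
print(f"total impurities: {sum(result.values()):.2f}%")
```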

 

What are reference standards?

The US Pharmacopeia (USP) defines reference standard materials as “highly characterized specimens of drug substances, excipients, reportable impurities, degradation products, compendial reagents, and performance calibrators”.

Scientists performing analytical testing use reference standards to determine quantitative (e.g., assay and impurity) as well as qualitative (e.g., identification tests) data, performance standards, and calibrators (e.g., melting point standards). The quality and purity of reference standards, therefore, are critical for reaching scientifically valid results.
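
As a simple illustration of how a reference standard supports quantitative results, the sketch below (hypothetical concentrations, areas and function name) applies the usual external-standard proportion to estimate an impurity level in a sample against an impurity reference standard.

```python
# Minimal sketch (illustrative assumption, not a compendial method):
# external-standard quantitation of an impurity against its reference standard,
# assuming equal injection volumes and no further dilution factors.
# % impurity (w/w) = (A_sample / A_std) * (C_std / C_sample) * 100

def impurity_percent(a_sample: float, a_std: float,
                     c_std_mg_per_ml: float, c_sample_mg_per_ml: float) -> float:
    return (a_sample / a_std) * (c_std_mg_per_ml / c_sample_mg_per_ml) * 100.0

# Hypothetical run: impurity standard at 0.005 mg/mL, sample at 1.0 mg/mL
print(f"{impurity_percent(310.0, 295.0, 0.005, 1.0):.3f}% w/w")
```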

 

Is it mandatory to test for impurities if the monograph of the drug substance or drug product does not specifically direct us to do so?

It is not possible to include in each monograph a test for every impurity or contaminant, or even an adulterant, that might be present. The absence of a limit for an impurity in a monograph does not absolve the manufacturer of the responsibility of providing assurance to the user of the safety of the drug.

It is incumbent on the manufacturer to follow Good Manufacturing Practices (GMP) and to ensure the limitation of impurities based on knowledge of the properties of the chemical entity and the likelihood of related substances being associated with the end product during production and subsequent storage.

 

What are degradation products?

Degradation products include the following:

  1. degradation products of the active ingredient in the drug product,
  2. reaction products of the active ingredient with the excipient(s),
  3. reaction products of the active ingredient with the immediate container/closure system and
  4. products of interaction between the various drugs in a combination product.

Both identified and unidentified degradation products are included in the acceptance criteria. Identification of such impurities is done from stability studies, forced degradation studies and analysis of routine production batches. Wider limits and/or additional controls may be required for impurities arising during manufacture or on storage of the dosage form.

 

What are enantiomeric impurities?

The single enantiomeric form of a chiral drug is considered an improved chemical entity that may offer a better pharmacological profile and an increased therapeutic index, with a more favourable adverse-reaction profile. However, the pharmacokinetic profiles of levofloxacin (the S-enantiomer) and ofloxacin (the racemate) are comparable, suggesting a lack of advantage of the single isomer in this regard. For manufacturers of a single-enantiomer drug (the eutomer), the undesired stereoisomer is controlled in the same manner as other organic impurities.

 

References:

  1. ICH Q6A, Specifications: Test Procedures and Acceptance Criteria for New Drug Substances and New Drug Products: Chemical Substances
  2. Indian Pharmacopoeia
  3. US Pharmacopeia
  4. British Pharmacopoeia