ISSN: 2455-5282
Global Journal of Medical and Clinical Case Reports
Research Article | Open Access | Peer-Reviewed

Mathematics and its Applications in Forensics: Case Studies and Analytical Approaches

Bilge Gecioglu*

Department of Forensic Medicine, Institute of Forensic Sciences, Graduate School of Health Sciences, Ankara University, Ankara, Turkey

*Corresponding author: Bilge Gecioglu, Department of Forensic Medicine, Institute of Forensic Sciences, Graduate School of Health Sciences, Ankara University, Ankara, Turkey, Email: bgecioglu@ankara.edu.tr
Received: 16 July, 2025 | Accepted: 25 July, 2025 | Published: 26 July, 2025

Cite this as

Gecioglu B. Mathematics and its Applications in Forensics: Case Studies and Analytical Approaches. Glob J Medical Clin Case Rep. 2025;12(7):170-176. Available from: https://doi.org/10.17352/2455-5282.000221

Copyright License

© 2025 Gecioglu B. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Aim: This paper aims to critically review the role of mathematics in forensic science, emphasizing how quantitative methods enhance the analysis, interpretation, and validation of forensic evidence. It explores statistical, geometrical, and computational approaches used across diverse forensic disciplines and illustrates these through landmark case studies.

Methodology: A comprehensive literature review was conducted focusing on forensic applications of mathematical techniques such as statistical modeling, Bayesian inference, geometric morphometrics, and machine learning. Selected case studies—including DNA profiling in the Colin Pitchfork case and bite mark analysis in the Ted Bundy investigation—were analyzed to demonstrate practical applications. Emerging computational methods and their integration into forensic workflows were also examined.

Results: Mathematical methodologies provide objective frameworks that increase the reliability and reproducibility of forensic analyses. Statistical models quantify evidentiary strength via likelihood ratios and probabilistic reasoning. Geometric and pattern analysis facilitate accurate biometric comparisons, while machine learning enhances automated evidence classification. Case studies confirm that mathematical rigor significantly contributes to successful forensic investigations and judicial outcomes.

Conclusion: The integration of advanced mathematical tools is indispensable for modern forensic science, improving the precision and transparency of evidence evaluation. Continued interdisciplinary collaboration and methodological innovation are essential to address current challenges and fully leverage mathematical techniques in forensic practice.

Introduction

Forensic science is traditionally perceived as a practical discipline, grounded in physical evidence and laboratory techniques. The increasing complexity of modern forensic investigations necessitates the use of mathematical reasoning for objective, repeatable, and evidence-based outcomes [1,2]. Mathematics plays a critical role in making sense of physical evidence, estimating probabilities, and validating forensic claims [3,4]. Traditional forensic sciences are increasingly complemented by algorithmic and quantitative methods, helping ensure consistency and reliability in legal proceedings [5]. Whether estimating the likelihood of a DNA match, reconstructing a trajectory from blood spatter, or identifying financial fraud patterns, mathematics provides the essential toolkit for drawing conclusions from incomplete or ambiguous data. As the legal system increasingly incorporates scientific testimony, the reliability of that testimony depends largely on the validity and transparency of its quantitative underpinnings.

This paper investigates how mathematics is used in forensic science, evaluates its strengths and limitations, and highlights opportunities for reform and improvement. We organize the discussion by domains of application, review key mathematical methods, and incorporate lessons from real-world legal cases to illustrate both success and misuse.

Methodology

A comprehensive literature review was conducted focusing on forensic applications of mathematical techniques. Thirty-eight forensic cases across nine domains were examined, demonstrating how mathematical frameworks—such as probability theory, Bayesian reasoning, time-series analysis, spatial statistics, and machine learning—directly influence forensic interpretations and judicial outcomes. Selected case studies were analyzed to demonstrate practical applications. Emerging computational methods and their integration into forensic workflows were also examined.

Results

Core mathematical applications in forensic science

Quantitative methods play a critical role in forensic science, offering rigorous frameworks for interpreting evidence and reducing ambiguity. This section provides a detailed overview of five core mathematical and statistical approaches—Bayesian inference, geometric modeling, machine learning, graph theory, and differential equations—each integral to contemporary forensic practice.

1. Statistical modeling and Bayesian inference: Bayesian inference offers a structured way to evaluate the probability of competing hypotheses in light of observed evidence. Its strength lies in explicitly modeling uncertainty—a crucial component when presenting evidence in courtrooms [4,6]. The fundamental equation of Bayes’ theorem allows forensic scientists to compute the posterior probability of a hypothesis (e.g., the suspect’s guilt) by combining the likelihood of the evidence with prior knowledge:

P(H|E) = [P(E|H) · P(H)] / P(E)

This model has been especially effective in DNA interpretation, where the probability of a random match must be carefully evaluated [1,3,7]. Bayesian decision frameworks also underpin fingerprint and voice pattern analysis, where probabilistic thresholds guide evaluative conclusions [8,9].

In courts, Bayesian models encourage the use of likelihood ratios and discourage binary “yes/no” testimony, promoting transparency in forensic reporting [2,6].
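As a concrete illustration, the short sketch below (with invented numbers) converts a prior probability and a likelihood ratio into a posterior probability, which is the core Bayesian update described above. The likelihood ratio of 1,000,000 is a hypothetical value standing in for a DNA random match probability of one in a million.

```python
# Minimal sketch: updating odds with a likelihood ratio (hypothetical numbers).

def posterior_probability(prior_prob: float, likelihood_ratio: float) -> float:
    """Convert a prior probability and a likelihood ratio into a posterior probability."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    posterior_odds = likelihood_ratio * prior_odds
    return posterior_odds / (1.0 + posterior_odds)

# Hypothetical DNA example: a random match probability of 1 in 1,000,000 gives
# LR = P(E|H_p) / P(E|H_d) = 1 / (1/1,000,000) = 1,000,000 for a full match.
lr = 1_000_000
for prior in (0.001, 0.01, 0.5):  # different prior probabilities of the prosecution hypothesis
    print(f"prior={prior:>6}: posterior={posterior_probability(prior, lr):.6f}")
```

Note how the same likelihood ratio yields very different posteriors under different priors, which is precisely why courts are encouraged to receive the likelihood ratio itself rather than a single "probability of guilt".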

2. Geometric and pattern-based analysis: Geometric modeling is foundational in reconstructing physical events from spatial evidence. Bloodstain Pattern Analysis (BPA), for instance, uses trigonometric models to estimate the angle and velocity of impact, enabling the determination of origin points [10,11]. The angle of impact θ is derived from the ellipse formed by a blood droplet:

θ = arcsin(minor axis / major axis)

In firearms examination, mathematical modeling informs bullet trajectory analysis to identify shooter positioning [12]. Similarly, affine transformations are employed in bite mark analysis to match dental impressions to suspects [13].

These techniques rely on spatial precision and geometrical relationships, making mathematics indispensable in the objective reconstruction of crime scenes [14].
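The impact-angle relation above is straightforward to implement; the following sketch applies it to a hypothetical stain measurement.

```python
import math

def impact_angle_degrees(minor_axis_mm: float, major_axis_mm: float) -> float:
    """Angle of impact from the width/length ratio of an elliptical bloodstain."""
    if not 0 < minor_axis_mm <= major_axis_mm:
        raise ValueError("minor axis must be positive and no larger than major axis")
    return math.degrees(math.asin(minor_axis_mm / major_axis_mm))

# Hypothetical stain measuring 4.0 mm wide by 8.0 mm long -> arcsin(0.5) = 30 degrees
print(impact_angle_degrees(4.0, 8.0))
```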

3. Computational techniques and machine learning: The integration of Artificial Intelligence (AI) and Machine Learning (ML) into forensic workflows enables automation of tasks previously reliant on human interpretation. Image recognition algorithms assist in facial recognition, weapon detection, and trace comparison, often through Convolutional Neural Networks (CNNs) [15].

Natural Language Processing (NLP) is applied in document forensics, analyzing authorship, detecting tampering, or profiling suspects based on linguistic patterns [16]. These systems are underpinned by mathematical structures—linear algebra, probability theory, and optimization algorithms [17].

Beyond classification, AI models also support pattern recognition in digital evidence and aid in real-time forensic decision-making [18].
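As a minimal, self-contained illustration of the linear-algebra machinery behind NLP-based authorship analysis (not any specific forensic system), the sketch below compares invented writing samples using TF-IDF character n-gram features and cosine similarity; it assumes scikit-learn is available.

```python
# Illustrative sketch only: the texts are invented placeholders, and real
# forensic authorship analysis uses far richer feature sets and validation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

known_samples = [
    "I will not repeat myself. Return the documents immediately.",
    "You will return the documents. I will not ask again.",
]
questioned = ["Return the documents now; I shall not ask twice."]

# Character n-grams are a common stylometric feature representation.
vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
X = vectorizer.fit_transform(known_samples + questioned)

# Similarity of the questioned document to each known writing sample
sims = cosine_similarity(X[-1], X[:-1])
print(sims)
```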

4. Graph theory and network analysis: Graph theory provides a mathematical basis for modeling relational data in forensic intelligence. Individuals, devices, or events are treated as nodes, while communications or transactions form edges connecting them. Analytical metrics such as degree centrality, betweenness, and community detection expose key actors and hidden structures within criminal networks [19].

This approach is critical in cybercrime investigations and organized crime mapping, where relational complexity is high [20,21]. Network modeling supports visual and algorithmic reconstruction of operations, improving law enforcement strategy and evidence presentation.
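A minimal sketch of these network metrics, using the networkx library over a small hypothetical communications graph, is shown below; the node labels and edges are invented.

```python
# Sketch of network analysis over a hypothetical communications graph.
import networkx as nx

G = nx.Graph()
# Nodes are individuals/devices; edges are observed communications (hypothetical).
G.add_edges_from([
    ("A", "B"), ("A", "C"), ("B", "C"),   # tightly knit cluster
    ("C", "D"),                           # broker edge
    ("D", "E"), ("D", "F"), ("E", "F"),   # second cluster
])

degree = nx.degree_centrality(G)         # how connected each actor is
between = nx.betweenness_centrality(G)   # who bridges otherwise separate groups

# "C" and "D" score highest on betweenness: they broker traffic between clusters.
for node in sorted(G):
    print(node, round(degree[node], 2), round(between[node], 2))
```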

5. Differential equations in death and decomposition modeling: Mathematical models based on differential equations are essential for estimating Postmortem Intervals (PMI). Newton’s Law of Cooling models body temperature decay:

T(t) = T_ambient + (T_0 − T_ambient) · e^(−kt)

This formula approximates time of death based on temperature differentials and environmental conditions [22]. In advanced cases, decomposition models incorporate multiple variables—humidity, insect activity, and microbial succession—captured by more complex dynamic systems [23].

These calculations provide courts with scientifically grounded temporal estimates, especially when eyewitness accounts or surveillance data are absent.
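For illustration, the cooling law above can be inverted algebraically to yield t = −(1/k) · ln((T(t) − T_ambient) / (T_0 − T_ambient)). The sketch below implements this inversion; the cooling constant k = 0.12/h and the scene temperatures are purely illustrative, since in practice k must be calibrated to the body and environment.

```python
import math

def hours_since_death(T_body: float, T_ambient: float,
                      T0: float = 37.0, k: float = 0.12) -> float:
    """Invert Newton's Law of Cooling for time since death (hours).

    k is a cooling constant that must be calibrated to the body and
    environment in practice; 0.12/h here is purely illustrative.
    """
    return -math.log((T_body - T_ambient) / (T0 - T_ambient)) / k

# Hypothetical scene: body at 30 °C, room at 20 °C
print(hours_since_death(30.0, 20.0))  # ≈ 4.4 hours under these assumed values
```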

Mathematics plays a central role in strengthening the objectivity and reproducibility of forensic work. Disciplines such as trigonometry, probability theory, graph theory, and machine learning are increasingly embedded into forensic protocols [14].

Statistical Modeling and Bayesian Inference: These techniques provide a foundation for evaluating forensic evidence, especially in DNA analysis [7], fingerprinting, and voice recognition [8,9]. Probabilistic modeling ensures transparency in communicating uncertainty to courts [4,6].

Geometric and Pattern-Based Analysis: Bloodstain pattern analysis, bullet trajectory, and biometric comparisons all rely on geometric modeling and spatial measurements [10-13].

Computational Techniques and Machine Learning: Artificial intelligence and natural language processing aid in automating complex tasks such as document analysis, image classification, and digital trace interpretation [15,17].

Graph Theory and Network Analysis: These tools assist in mapping suspect networks and uncovering hidden connections in organized crime or cyber investigations [19,20].

Differential Equations: Thermal models, including Newton’s Law of Cooling, estimate time of death and decomposition rates under varying environmental conditions [22,23].

Forensic cases: Mathematical applications in forensic investigations

Mathematics has played an indispensable role in numerous high-profile forensic investigations and court proceedings. Table 1 summarizes twenty-eight verified forensic cases in which mathematical techniques played a pivotal role in evidence interpretation. These cases span multiple domains, including DNA profiling, bite mark analysis, digital forensics, and crime scene reconstruction. Key mathematical tools include Bayesian inference, likelihood ratios, geometric morphometrics, and Markov Chain Monte Carlo (MCMC) simulations. For instance, Bayesian genotyping was used in victim identification, while Singular Value Decomposition (SVD) and Principal Component Analysis (PCA) aided in detecting deepfake media. Other applications covered arson modeling, pharmacokinetics in overdose cases, and kinship inference in mass grave identifications. Collectively, these cases underscore the cross-disciplinary value of mathematics in enhancing forensic objectivity and scientific rigor [2,13,16,18,21,24,25,27-29,32,35,38,39].

Discussion

Mathematical modeling has become a cornerstone in the practice of forensic science, offering a structured and objective means to interpret complex forms of evidence. Across a wide array of forensic domains—from DNA profiling and biometric comparisons to fire scene reconstruction and digital evidence analysis—mathematical techniques provide tools not only for analysis but also for validation, uncertainty estimation, and courtroom communication. The reviewed cases and methodologies demonstrate that these tools are not merely technical aids but essential components in achieving scientifically sound and legally defensible conclusions.

One of the most transformative contributions of mathematics to forensic science has been the integration of Bayesian statistics into evidence evaluation. In cases such as R v. Adams, Colin Pitchfork, and People v. Turner, Bayesian likelihood ratios were employed to assess the probability of DNA matches under competing hypotheses [1,7,8]. Unlike categorical judgments of ‘match’ or ‘no match’, Bayesian methods allow forensic scientists to articulate the strength of evidence on a probabilistic scale. This approach increases transparency, especially when combined with population allele frequency data to quantify random match probabilities [3,34]. The likelihood ratio framework has now been widely adopted in forensic DNA interpretation and is increasingly applied to other trace evidence domains such as footwear marks and glass fragments [32,36].

Beyond probabilistic modeling, geometric and statistical methods have played a critical role in the analysis of physical patterns and impressions. The wrongful conviction of Ray Krone, based on erroneous bitemark evidence, highlighted the limitations of subjective pattern analysis and the need for quantifiable reliability metrics [13]. Affine transformations and morphometric modeling now allow for more rigorous comparisons of dental impressions by accounting for geometric distortion, thereby reducing false positives. Similarly, in bloodstain pattern analysis, trigonometric models enable analysts to calculate the angle of impact of blood droplets based on the geometry of the stain [10,11]. These methods have improved the scientific basis of pattern analysis and underscore the need for continued training and validation.

Errors in the application or interpretation of mathematical reasoning have also had profound consequences in forensic contexts. The People v. Collins case is frequently cited for its misuse of conditional probability and the so-called prosecutor’s fallacy—where the probability of observing the evidence given innocence is misinterpreted as the probability of innocence given the evidence [2,31]. A similar statistical misstep contributed to the wrongful conviction of Sally Clark, where the assumption of independence between two sudden infant deaths led to an erroneously low joint probability estimate [6,29]. These cases emphasize the importance of statistical literacy among legal professionals and the necessity of expert testimony that can clearly explain probabilistic concepts to lay audiences [36].
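A simple numeric illustration of the fallacy, with invented figures, is sketched below: a tiny match probability given innocence does not by itself imply a tiny probability of innocence once the size of the potential source population is taken into account.

```python
# Numeric illustration of the prosecutor's fallacy (all numbers invented).
# P(evidence | innocent) is small, but P(innocent | evidence) can still be
# large when the pool of potential sources is large.

p_match_given_innocent = 1e-6   # random match probability (assumed)
population = 5_000_000          # plausible source population (assumed)

# Expected number of innocent people who would also match:
expected_innocent_matches = p_match_given_innocent * population  # = 5.0

# If exactly one person is the true source, a matching suspect is one of
# roughly six matching people, absent other evidence:
p_guilty_given_match = 1 / (1 + expected_innocent_matches)
print(f"P(match | innocent) = {p_match_given_innocent}")
print(f"P(guilty | match)  ≈ {p_guilty_given_match:.2f}  # not 0.999999")
```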

Recent advancements in Artificial Intelligence (AI) and Machine Learning (ML) have brought new opportunities—and challenges—to forensic science. Techniques such as Convolutional Neural Networks (CNNs) and Natural Language Processing (NLP) are now used in areas ranging from facial recognition and handwriting analysis to deepfake detection and authorship attribution [15,17]. For example, in the Karen Read case, digital timeline reconstruction relied on time-series analysis and device metadata to validate or dispute alibis [17,18].

Likewise, ML models using Principal Component Analysis (PCA) and Singular Value Decomposition (SVD) have been developed to detect manipulated media by identifying statistical anomalies across video frames [30]. However, the opacity of some AI systems raises concerns about interpretability, algorithmic bias, and reproducibility, especially when such tools are presented in adversarial court settings [5,20].
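As a toy illustration of this idea (synthetic data, not real media), the sketch below screens per-frame feature vectors with an SVD-based principal component reconstruction and flags frames whose reconstruction error is anomalously large.

```python
# Toy PCA/SVD anomaly screen over synthetic "frames" (feature vectors).
import numpy as np

rng = np.random.default_rng(0)
frames = rng.normal(size=(200, 64))   # 200 frames x 64 features (synthetic)
frames[120] += 2.0                    # inject a statistically anomalous frame

ref = frames[:100]                    # assume an early segment is representative
mu = ref.mean(axis=0)
_, _, Vt = np.linalg.svd(ref - mu, full_matrices=False)
top = Vt[:10]                         # top-10 principal directions of the reference

centered = frames - mu
residual = centered - (centered @ top.T) @ top   # part PCA cannot explain
errors = np.linalg.norm(residual, axis=1)

threshold = errors.mean() + 3 * errors.std()
print(np.where(errors > threshold)[0])           # -> [120]
```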

Mathematics also plays a central role in postmortem investigations. Differential equations, particularly those modeling thermal decay (e.g., Newton’s Law of Cooling), are used to estimate Postmortem Intervals (PMI) based on the rate of body temperature decline [22]. However, as demonstrated in the Jill Dando and Harold Shipman investigations, environmental variables such as ambient temperature, humidity, and body mass can significantly alter decomposition rates [23,30]. This highlights the necessity for interdisciplinary collaboration and continual model refinement using empirical data from real-world death scenes.

The emergence of genetic genealogy as an investigative tool has further expanded the role of mathematics in forensic identification. The Golden State Killer case is a prominent example where kinship coefficients and pedigree likelihood models were used to match crime scene DNA with distant relatives in genealogical databases [33,34]. Bayesian networks and combinatorial algorithms also facilitated victim identification efforts in Bosnia’s mass graves, where degraded DNA and incomplete family trees presented severe analytical challenges [35]. These methodologies reflect the growing need for computational tools that can handle complex, multi-generational relationship data with precision.
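As a deliberately simplified illustration of kinship-based reasoning, the sketch below classifies a likely relationship from an observed fraction of shared autosomal DNA using textbook expected sharing values; real genetic-genealogy pipelines model segment counts and lengths and evaluate full pedigree likelihoods rather than applying a nearest-value rule.

```python
# Simplified sketch: nearest expected identical-by-descent (IBD) sharing value.
EXPECTED_SHARING = {                 # expected autosomal IBD sharing fraction
    "identical twin": 1.00,
    "parent-child / full sibling": 0.50,
    "half sibling / grandparent / aunt-uncle": 0.25,
    "first cousin": 0.125,
    "second cousin": 0.03125,
    "unrelated": 0.0,
}

def closest_relationship(observed_fraction: float) -> str:
    """Return the relationship whose expected sharing is nearest the observation."""
    return min(EXPECTED_SHARING,
               key=lambda r: abs(EXPECTED_SHARING[r] - observed_fraction))

print(closest_relationship(0.11))   # -> "first cousin"
```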

Control charts, sequential tests, and anomaly detection models have also been used to detect irregular mortality patterns in healthcare settings. The Harold Shipman case exemplifies how statistical monitoring of death rates using CUSUM (Cumulative Sum Control Charts) can flag abnormal trends before traditional qualitative methods detect a pattern [30]. In overdose cases, pharmacokinetic equations modeling drug absorption, distribution, metabolism, and excretion (ADME processes) are employed to reconstruct toxicological timelines, offering insight into the likely time and cause of death [33].
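A minimal one-sided CUSUM sketch over hypothetical monthly death counts is given below; the baseline mean, slack, and decision threshold are assumed values that would be estimated from historical data in practice.

```python
# Minimal one-sided CUSUM: S_t = max(0, S_{t-1} + (x_t - (mu0 + slack))).
# An alarm is raised when S_t exceeds the decision threshold h.
# mu0, slack, and h are assumed values, not calibrated parameters.

def cusum_alarms(counts, mu0=2.0, slack=0.5, h=4.0):
    s, alarms = 0.0, []
    for t, x in enumerate(counts):
        s = max(0.0, s + (x - (mu0 + slack)))
        if s > h:
            alarms.append(t)
    return alarms

monthly_deaths = [2, 1, 3, 2, 2, 5, 6, 4, 5, 2]   # invented data with a shift
print(cusum_alarms(monthly_deaths))               # alarms once the shift accumulates
```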

Despite these advancements, a persistent gap in statistical literacy among judges, attorneys, and jurors remains a critical issue [31,36]. Complex mathematical results—such as a likelihood ratio or an error probability—may be misunderstood or misrepresented without proper expert interpretation. Therefore, the presence of trained forensic statisticians and clearer, more structured forms of expert testimony are increasingly essential to avoid miscarriages of justice and uphold the integrity of legal outcomes.

As adversarial strategies grow more sophisticated and forensic evidence becomes increasingly digitized and data-rich, the importance of robust, transparent, and validated mathematical models cannot be overstated. Forensic mathematics must evolve in parallel with developments in data science, technology, and legal standards to remain both scientifically valid and legally reliable. Future directions may include the formal certification of forensic algorithms, the development of explainable AI models for court presentation, and the expansion of mathematical literacy in forensic education programs.

In sum, mathematics in forensic science is not merely an ancillary tool but a foundational pillar of modern evidence analysis. Its proper application enhances objectivity, reduces bias, improves transparency, and ultimately strengthens the judicial process.

The reviewed cases and methodologies affirm the indispensable value of mathematical tools in forensic science. Quantitative models not only enhance objectivity but also increase the admissibility and credibility of forensic evidence in legal contexts [40]. Bayesian statistics, for example, has revolutionized the communication of uncertainty in evidence interpretation by providing likelihood ratios rather than categorical yes/no judgments [26]. The incorporation of probability and statistical rigor has improved transparency, especially in DNA profiling and trace analysis [4,29].

Machine learning and artificial intelligence tools have increasingly become central in forensic workflows—from predictive policing and crime linkage to document verification and biometric recognition [15,30]. However, these tools require rigorous validation and ethical oversight to prevent algorithmic bias and ensure reproducibility. Similarly, differential equation models used for postmortem interval estimation must account for varying environmental parameters and require continuous refinement and interdisciplinary validation [23]. Another emerging concern is the gap in statistical literacy among legal professionals and jurors, which can lead to misinterpretation of probabilistic findings [31,40]. The need for forensic statisticians and clearer expert testimony has thus become critical. Furthermore, forensic mathematics must evolve alongside technological advancements, increasing data complexity, and evolving adversarial techniques to remain robust and courtroom-reliable.

Recommendations

To strengthen the reliability, transparency, and legal robustness of forensic science, the following recommendations are proposed based on the findings and case analyses presented in this study:

Require method validation and performance auditing: Mathematical and algorithmic methods used in forensic investigations—particularly AI and machine learning applications—must undergo rigorous validation before courtroom use. Techniques such as fingerprint matching, facial recognition, or deepfake detection should be benchmarked using large, representative datasets and independently audited for reproducibility, bias, and accuracy [15,20]. Regulatory frameworks akin to those used in biomedical device validation could be adapted to ensure algorithmic tools meet evidentiary standards.

Promote interdisciplinary collaboration in model development: Postmortem interval estimation, genetic genealogy, and digital reconstruction often require integrating data from pathology, anthropology, environmental science, and computer science. The differential equation models used for estimating time of death, for instance, must be calibrated with real-world environmental data [22,23]. Interdisciplinary research initiatives and funding should support collaborative model development that reflects forensic complexity instead of relying on static or overly simplistic assumptions.

Standardize communication of uncertainty in expert testimony: The presentation of statistical evidence in court must be standardized to minimize ambiguity and enhance juror comprehension. Guidelines for expert witnesses should encourage the use of visual tools such as probabilistic charts, confidence intervals, and likelihood ratio scales to explain findings clearly. Just as the introduction of the likelihood ratio framework improved the communication of DNA evidence [8], similar efforts are needed across other forensic domains, including bite mark analysis, trace evidence, and digital forensics.

Develop open, transparent, and explainable forensic algorithms: Especially in machine learning-driven evidence evaluation, transparency is critical. Black-box algorithms are difficult to defend under cross-examination. Developers and forensic laboratories should prioritize explainable AI (XAI) methods that enable experts to provide justifiable, transparent decisions rooted in mathematical reasoning [17,30]. Open-source toolkits and publicly available validation datasets would foster accountability and reproducibility.

Institutionalize forensic statistical training: A recurring theme in miscarriages of justice—such as the Sally Clark and People v. Collins cases—is the misapplication or misinterpretation of basic probabilistic principles. Legal professionals, including judges and prosecutors, often lack formal training in statistical thinking [31,36]. It is therefore essential to incorporate mandatory statistical and probabilistic literacy modules in legal and forensic science curricula. Continuous professional education should be offered for practitioners and court officers to ensure they are capable of interpreting likelihood ratios, statistical error rates, and probabilistic findings accurately.

Encourage the role of dedicated forensic statisticians in casework: Finally, the integration of professional forensic statisticians into investigative and judicial workflows should be institutionalized. Their expertise is vital not only for analyzing complex evidence but also for advising on the design of forensic experiments, evaluating competing hypotheses, and ensuring that statistical arguments presented in court are scientifically and legally sound.

By implementing these recommendations, forensic science can better align itself with evolving legal expectations and scientific standards, ultimately reducing the risk of error and enhancing public trust in the justice system.

Conclusion

Mathematics strengthens forensic science by introducing structure, analytical rigor, and empirical accountability. Its application improves evidence interpretation and supports more reliable judicial outcomes. The integration of mathematical approaches in forensic science is no longer optional—it is foundational. As illustrated through both methodology and high-impact casework, mathematics enhances every phase of forensic analysis, from evidence acquisition and interpretation to courtroom presentation. These tools improve the objectivity, reproducibility, and scientific defensibility of forensic conclusions. Looking forward, continuous interdisciplinary collaboration between mathematicians, forensic scientists, computer scientists, and legal experts will be key. Efforts must focus on methodological transparency, ethical deployment of machine learning, standardization of probabilistic reporting, and improved statistical education for legal stakeholders [33,37,41]. As forensic demands and technologies evolve, so must the mathematical frameworks that underpin them, thereby ensuring justice through analytical rigor and methodological reliability.

Highlights

  • This study identifies five core mathematical approaches essential to forensic science: Bayesian inference, geometric modeling, machine learning, graph theory, and differential equations.
  • Twenty-eight high-profile forensic cases were analyzed to demonstrate the applied value of quantitative models in evidence interpretation.
  • Bayesian statistics and likelihood ratios are shown to enhance the transparency and probative strength of DNA evidence in court.
  • Geometric and morphometric models improve reliability in bloodstain pattern analysis, bullet trajectory estimation, and bitemark comparison.
  • Machine learning and AI tools support forensic tasks such as deepfake detection, digital timeline reconstruction, and authorship analysis, but require rigorous ethical oversight and empirical validation.
  • Differential equation models for postmortem interval estimation are highlighted with emphasis on environmental calibration requirements and cross-disciplinary validation needs.
  • Recommendations include enhancing statistical literacy in legal settings, mandating method validation, and embedding forensic statisticians systematically within casework processes.
References

  1. Bruno C. Mathematical approach in forensic science and use of probability in evaluation of evidence [Internet]. 2005. Available from: https://art.torvergata.it/handle/2108/94
  2. Aitken C. Statistics and forensic science. In: Fraser J, Williams R, editors. Handbook of forensic science. Cullompton (UK): Willan; 2013. p. 387–418.
  3. Brenner CH. Fundamental problem of forensic mathematics—the evidential value of a rare haplotype. Forensic Sci Int Genet. 2010;4(4):281–91. Available from: https://doi.org/10.1016/j.fsigen.2009.10.013
  4. Xu X, Vinci G. Forensic science and how statistics can help it: Evidence, likelihood ratios, and graphical models. Wiley Interdiscip Rev Comput Stat. 2024;16(5):e70006. Available from: https://doi.org/10.1002/wics.70006
  5. Crispino F, Weyermann C, Delémont O, Roux C, Ribaux O. Towards another paradigm for forensic science? Wiley Interdiscip Rev Forensic Sci. 2022;4(3):e1441. Available from: https://doi.org/10.1002/wfs2.1441
  6. Lucy D. Introduction to statistics for forensic scientists. Chichester (UK): John Wiley & Sons; 2013.
  7. National Institute of Justice. DNA – A prosecutor’s practice notebook [Internet]. Washington (DC): U.S. Department of Justice; 2023. Available from: https://nij.ojp.gov/nij-hosted-online-training-courses/dna-prosecutors-practice-notebook-inventory/home
  8. Taroni F, Bozza S, Biedermann A, Garbolino P, Aitken C. Data analysis in forensic science: A Bayesian decision perspective [Internet]. Chichester (UK): John Wiley & Sons; 2010. Available from: https://www.ndl.ethernet.edu.et/bitstream/123456789/29062/1/Franco%20Taroni_2010.pdf
  9. Evett IW, Berger CEH, Buckleton JS, Champod C, Jackson G. Finding the way forward for forensic science in the US—A commentary on the PCAST report. Forensic Sci Int. 2017;278:16–23. Available from: https://doi.org/10.1016/j.forsciint.2017.06.018
  10. Makovický P, Horáková P, Slavík P, Mošna F, Pokorná O. The use of trigonometry in bloodstain analysis. Soud Lek. 2013;58(2):20–5. Available from: https://pubmed.ncbi.nlm.nih.gov/23641723/
  11. Joris P, Develter W, Jenar E, Suetens P, Vandermeulen D, Van de Voorde W, et al. Calculation of bloodstain impact angles using an Active Bloodstain Shape Model. J Forensic Radiol Imaging. 2014;2(4):188–98. Available from: https://doi.org/10.1016/j.jofri.2014.09.004
  12. Zalewski EN. Mathematics in forensic firearm examination [Internet]. Syracuse (NY): Syracuse University; 2015. Available from: https://surface.syr.edu/honors_capstone/837/
  13. Sheets HD, Bush MA. Mathematical matching of a dentition to bitemarks: Use and evaluation of affine methods. Forensic Sci Int. 2011;207(1–3):111–8. Available from: https://doi.org/10.1016/j.forsciint.2010.09.013
  14. Kaviya B. Mathematics in forensic science. Louis Savinien Dupuis J Multidiscip Res. 2024;3:154–8. Available from: https://doi.org/10.21839/lsdjmr.2024.v3.121
  15. Chinnikatti SK. Artificial intelligence in forensic science. Forensic Sci Addict Res. 2018;2(5):000554. Available from: http://dx.doi.org/10.31031/FSAR.2018.03.000554
  16. Bird CL, Yang X. Forensic document examination: A global snapshot. Forensic Sci Res. 2025;owaf001. Available from: https://doi.org/10.1093/fsr/owaf001
  17. Pyarelal KM. Mathematical basics as a prerequisite to artificial intelligence in forensic analysis. In: Numerical Simulation – Advanced Techniques for Science and Engineering [Internet]. London: IntechOpen; 2022. Available from: https://www.intechopen.com/chapters/85019
  18. Novak M, Grier J, Gonzales D. New approaches to digital evidence acquisition and analysis. NIJ J. 2018;280:1–8. Available from: https://www.ojp.gov/library/publications/new-approaches-digital-evidence-acquisition-and-analysis
  19. Baechler S, Morelato M, Gittelson S, Walsh S, Margot P, Roux C, et al. Breaking the barriers between intelligence, investigation and evaluation: A continuous approach to define the contribution and scope of forensic science. Forensic Sci Int. 2020;309:110213. Available from: https://doi.org/10.1016/j.forsciint.2020.110213
  20. Biedermann A, Bozza S, Taroni F. Normative decision analysis in forensic science. Artif Intell Law. 2020;28:7–25. Available from: https://link.springer.com/article/10.1007/s10506-018-9232-2
  21. Lopez BE, McGrath JG, Taylor VG. Using forensic intelligence to combat serial and organized violent crimes. NIJ J. 2020;282:1–11. Available from: https://www.ojp.gov/pdffiles1/nij/254471.pdf
  22. Vidoli GM. New method for measuring human decomposition could significantly impact medicolegal death investigations [Internet]. Washington (DC): National Institute of Justice; 2021. Available from: https://nij.ojp.gov/topics/articles/new-method-measuring-human-decomposition-could-impact-medicolegal-death-investigations
  23. Casali MB, Blandino A, Grignaschi S, Florio EM, Travaini G, Genovese UR. The pathological diagnosis of the height of fatal falls: A mathematical approach. Forensic Sci Int. 2019;302:109883. Available from: https://doi.org/10.1016/j.forsciint.2019.109883
  24. National Institute of Justice [Internet]. Washington (DC): U.S. Department of Justice. Available from: https://nij.ojp.gov/
  25. Perlin MW, Szabady B. Linear mixture analysis: A mathematical approach to resolving mixed DNA samples. J Forensic Sci. 2001;46(6):1372–8. Available from: https://doi.org/10.1520/JFS15158J
  26. Taroni F, Bozza S, Biedermann A, Garbolino P, Aitken C. Data analysis in forensic science: A Bayesian decision perspective [Internet]. Chichester (UK): John Wiley & Sons; 2010. Available from: https://www.ndl.ethernet.edu.et/bitstream/123456789/29062/1/Franco%20Taroni_2010.pdf
  27. National Institute of Justice. Geographic profiling in serial crimes [Internet]. Washington (DC): U.S. Department of Justice; 2020. Available from: https://nij.ojp.gov/topics/articles
  28. Crispino F, Ribaux O, Houck M, Margot P. Forensic science–A true science? Aust J Forensic Sci. 2011;43(2–3):157–76. Available from: https://doi.org/10.1080/00450618.2011.555416
  29. Curran JM. Statistics in forensic science. Wiley Interdiscip Rev Comput Stat. 2009;1(2):141–56. Available from: https://doi.org/10.1002/wics.33
  30. Sharma MT, Sharma MS. Solving crimes with numbers: Fusion of mathematics and data analytics. In: Forensic Innovations in Criminal Investigations. 2025. p. 67.
  31. Redmayne M, Roberts P, Aitken C, Jackson G. Forensic science evidence in question. Crim Law Rev. 2011;5:347–56. Available from: https://rke.abertay.ac.uk/en/publications/forensic-science-evidence-in-question
  32. Biedermann A, Taroni F, Champod C. How to assign a likelihood ratio in a footwear mark case: An analysis and discussion in the light of R v T. Law Probab Risk. 2012;11:259–77. Available from: https://doi.org/10.1093/lpr/mgs015
  33. Zakaria A, El-Minshawi M, Magdy M, Mahmoud M, Mohamed G, Morgan N, et al. Bridging the gap: interdisciplinary approaches to mathematical applications. 2024;1(1):147–58. Available from: https://journals.ekb.eg/article_368781_25005f046b624f1b802ef2b092095695.pdf
  34. Buckleton J. Bayesian networks in forensic DNA interpretation. Boca Raton (FL): CRC Press; 2016.
  35. Cenanović M, Pojskic N, Kovacevic L, Dzehverovic M, Cakar J, Musemic D, et al. Diversity of Y-short tandem repeats in the representative sample of the population of Canton Sarajevo residents, Bosnia and Herzegovina. Coll Antropol. 2010;34(2):545–50. Available from: https://pubmed.ncbi.nlm.nih.gov/20698129/
  36. Evett IW. A Bayesian approach to the problem of interpreting glass evidence in forensic science casework. J Forensic Sci Soc. 1986;26(1):3–18. Available from: https://doi.org/10.1016/s0015-7368(86)72441-9
  37. Toneva D, Nikolova S, Harizanov S, Zhelev I. Applied mathematics for forensic medicine. In: Mathematics of Life MoL2021. Sofia (BG): Institute of Mathematics and Informatics, Bulgarian Academy of Sciences; 2021. p. 24. Available from: https://parallel.bas.bg/~nevena/mol2020/images/Book_of_Abstracts_online.pdf#page=32
  38. Koterová A, Navega D, Stepanovský M, Buk Z, Brůžek J, Cunha E. Age estimation of adult human remains from hip bones using advanced methods. Forensic Sci Int. 2018;287:163–75. Available from: https://doi.org/10.1016/j.forsciint.2018.03.047
  39. Hicklin RA, Winer KR, Kish PE, Parks CL, Chapman W, Dunagan K, et al. Accuracy and reproducibility of conclusions by forensic bloodstain pattern analysts. Forensic Sci Int. 2021;325:110856. Available from: https://doi.org/10.1016/j.forsciint.2021.110856
  40. Ligertwood A, Edmond G. Expressing evaluative forensic science opinions in a court of law. Law Probab Risk. 2012;11(4):289–302. Available from: https://doi.org/10.1093/lpr/mgs016
  41. Colón JMT. A model of interdisciplinary approaches with math, research, robotics, and forensic sciences: The UNE R³-STEM project. Int J Educ Excell. 2021;7(2):97–110. Available from: https://documento.uagm.edu/cupey/ijee/ijee_7_2.pdf#page=99
 
