Document Type: Original Research Paper

Authors

1 Mining Engineering Faculty, Sahand University of Technology, Tabriz, Iran.

2 Lead Associate, Delve Underground, Walnut Creek, CA, USA.

DOI: 10.22044/jme.2026.17274.3421

Abstract

Back analysis of tunnel excavation plays a fundamental role in calibrating geomechanical parameters using field monitoring data. However, conventional direct back analysis procedures remain computationally demanding and highly dependent on operator supervision. This study presents an integrated Finite Difference Method–Genetic Algorithm (FDM–GA) framework for automated tunnel back analysis, implemented entirely within the FLAC environment using the embedded FISH programming language. The proposed approach eliminates the need for external optimization software and data transfer between numerical and artificial intelligence platforms. A simplified genetic algorithm is coupled directly with finite difference simulations to iteratively minimize the discrepancy between measured and computed tunnel convergences. The framework incorporates constrained parameter optimization, automated handling of non-convergent models, and a robust convergence-based stopping criterion that avoids predefined error thresholds. Verification is performed using two synthetic plane-strain tunnel models representing stiff cohesive soil and dense granular material. Six unknown parameters (ρ, E, ν, c, φ, and K₀) are back-calculated using only three convergence measurements. Results from multiple independent runs demonstrate stable convergence toward very small error values (on the order of 10⁻⁶–10⁻⁵) and consistent reproduction of synthetic monitoring data. The method successfully narrows broad initial parameter ranges and produces multiple acceptable parameter sets, explicitly acknowledging the non-uniqueness inherent in back analysis problems. The developed FDM–GA framework provides an efficient, self-contained, and adaptable tool for practical tunnel back analysis applications.
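The coupling described in the abstract — a genetic algorithm whose fitness function is the mismatch between measured and simulated tunnel convergences, with bound-constrained parameters and a convergence-based (rather than error-threshold) stopping rule — can be illustrated with a minimal sketch. The sketch below is not the paper's FISH implementation: the `forward_model` function is a purely illustrative closed-form stand-in for a FLAC finite difference run, the three-parameter vector (E, c, φ) is a reduced version of the paper's six unknowns, and all bounds, operators, and rates are assumptions chosen for the demonstration.

```python
import random

def forward_model(p):
    """Stand-in for the FDM simulation: maps (E, c, phi) to three
    tunnel convergences. Purely illustrative, NOT the paper's model;
    in the actual framework this is a full FLAC run driven by FISH."""
    E, c, phi = p
    return [1000.0 / E + 0.01 * c,
            800.0 / E + 0.02 * phi,
            1200.0 / E + 0.005 * c * phi]

BOUNDS = [(50.0, 500.0), (5.0, 50.0), (20.0, 40.0)]   # assumed (E, c, phi) ranges
TARGET_P = (200.0, 20.0, 30.0)                        # "true" parameters
MEASURED = forward_model(TARGET_P)                    # synthetic monitoring data

def error(p):
    """Fitness: squared mismatch between computed and measured convergences."""
    return sum((s - m) ** 2 for s, m in zip(forward_model(p), MEASURED))

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def run_ga(pop_size=40, generations=200, seed=0):
    rng = random.Random(seed)
    # initial population sampled uniformly inside the parameter bounds
    pop = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=error)
        # convergence-based stop: halt when the population has collapsed
        # (worst ~ best), not when a predefined error threshold is met
        if error(pop[0]) > 0 and error(pop[-1]) / error(pop[0]) < 1.001:
            break
        elite = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            w = rng.random()
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]  # blend crossover
            if rng.random() < 0.2:            # mutate one gene, kept in bounds
                i = rng.randrange(len(child))
                lo, hi = BOUNDS[i]
                child[i] = clamp(child[i] + rng.gauss(0, 0.05 * (hi - lo)), lo, hi)
            children.append(child)
        pop = elite + children
    return min(pop, key=error)

best = run_ga()
```

Because the GA only ranks candidate parameter sets by their convergence mismatch, several distinct sets can reach comparably small errors, which mirrors the non-uniqueness the paper notes; running `run_ga` with different seeds exposes this directly.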
