Self-absorption in optically thick plasmas strongly affects the measured line intensities and degrades the quantification performance of laser-induced breakdown spectroscopy (LIBS); calibration-free LIBS (CF-LIBS) in particular requires proper correction. In this paper, a new self-absorption correction method for CF-LIBS, called blackbody radiation referenced self-absorption correction (BRR-SAC), is proposed. An iterative algorithm was designed to calculate the plasma temperature and the normally hard-to-obtain collection efficiency of the optical collection system by directly comparing the measured spectrum with the corresponding theoretical blackbody radiation, and these quantities were then used for self-absorption correction. Compared with commonly applied self-absorption correction methods based on the curve-of-growth principle, the proposed method offers simpler programming, higher computational efficiency, and independence from the availability or accuracy of line-broadening coefficients. Experiments were conducted on titanium alloy samples. The results showed that BRR-SAC corrected the self-absorption, increasing the linearity of the Boltzmann plots and significantly improving the measurement accuracy of the elemental concentrations. The proposed method also outperformed traditional CF-LIBS with self-absorption correction. In addition, BRR-SAC provides a simple way to obtain the collection efficiency of the experimental setup, which benefits plasma diagnostics and quantitative analysis.
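The comparison at the heart of the method above rests on Planck's law. The following minimal Python sketch is purely illustrative and is not the paper's implementation: it assumes that the peak of a strongly self-absorbed line saturates near the product of the collection efficiency and the blackbody spectral radiance at the plasma temperature, so that the efficiency can be estimated as their ratio. All function names are hypothetical, and the iterative temperature update described in the abstract is omitted.

```python
import math

# Physical constants in SI units (CODATA values)
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
K_B = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temperature_k):
    """Blackbody spectral radiance B_lambda(T) from Planck's law,
    in W * sr^-1 * m^-3, at the given wavelength and temperature."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = H * C / (wavelength_m * K_B * temperature_k)
    return a / math.expm1(b)

def collection_efficiency(peak_radiance, wavelength_m, temperature_k):
    """Hypothetical estimator: in the optically thick limit the measured
    peak of a saturated line approaches eps * B_lambda(T), so the ratio
    of the calibrated peak to the theoretical blackbody radiance gives
    the collection efficiency eps of the optical system."""
    return peak_radiance / planck_radiance(wavelength_m, temperature_k)
```

In a full BRR-SAC loop, this efficiency estimate and a Boltzmann-plot temperature would be refined alternately until both converge; the sketch only shows the single blackbody-referencing step.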