Forecasting armed conflict is a critical area of research with the potential to save lives and mitigate suffering. While existing forecasting models offer valuable point predictions, they often lack individual-level uncertainty estimates, limiting their usefulness for decision-making. Several approaches exist for estimating uncertainty, such as parametric and Bayesian prediction intervals, bootstrapping, and quantile regression, but these methods often rely on restrictive assumptions, struggle to provide well-calibrated intervals across the full range of outcomes, or are computationally intensive. Conformal prediction offers a model-agnostic alternative that guarantees a user-specified level of coverage, but it typically provides only marginal coverage, potentially resulting in non-uniform coverage across different regions of the outcome space. In this article, we introduce a novel extension called bin-conditional conformal prediction (BCCP), which enhances standard conformal prediction (SCP) by ensuring consistent coverage rates across user-defined subsets (bins) of the outcome variable. We apply BCCP to simulated data as well as to forecasting fatalities from armed conflict, and demonstrate that it provides well-calibrated uncertainty estimates across various ranges of the outcome. Compared to SCP, BCCP offers improved local coverage, though this comes at the cost of slightly wider prediction intervals.
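To make the idea concrete, the sketch below illustrates one plausible split-conformal implementation of bin-conditional calibration; it is an assumption-laden illustration, not the paper's actual code. It assumes absolute-residual nonconformity scores, a held-out calibration set, and that the BCCP set is formed by calibrating separately within each outcome bin and taking the union over bins of each bin intersected with its bin-specific conformal band. The function names (`conformal_quantile`, `bccp_interval`) and the bin-edge interface are hypothetical.

```python
import numpy as np

def conformal_quantile(scores, alpha):
    """Standard split-conformal quantile: the ceil((n+1)(1-alpha))/n empirical
    quantile of the calibration scores (capped at 1 for small n)."""
    n = len(scores)
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    return np.quantile(scores, level, method="higher")

def bccp_interval(y_cal, yhat_cal, yhat_new, bin_edges, alpha):
    """Hypothetical bin-conditional conformal set for one new prediction.

    For each outcome bin [lo, hi), calibrate on the calibration points whose
    observed outcome falls in that bin, then keep the part of the resulting
    conformal band that lies inside the bin. The returned list of sub-intervals
    (their union) is the prediction set.
    """
    pieces = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (y_cal >= lo) & (y_cal < hi)
        scores = np.abs(y_cal[in_bin] - yhat_cal[in_bin])
        if scores.size == 0:          # skip bins with no calibration data
            continue
        q = conformal_quantile(scores, alpha)
        # intersect the bin with the bin-specific band around the point prediction
        piece_lo, piece_hi = max(lo, yhat_new - q), min(hi, yhat_new + q)
        if piece_lo <= piece_hi:
            pieces.append((piece_lo, piece_hi))
    return pieces
```

Under this construction, bins where the model is less accurate receive wider bin-specific bands, which is how per-bin (local) coverage is maintained; the union of sub-intervals can be wider than a single SCP interval, consistent with the coverage-versus-width trade-off described above.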