We consider rather general partially observed models, including continuous-time Markov processes with discrete-time observations, and we introduce new interacting Monte Carlo schemes to approximate the optimal filter and related contrast functions (such as the likelihood function, conditional least squares, etc.). Under suitable regularity assumptions on the coefficients with respect to the unknown parameter, we show that these Monte Carlo approximations are smooth (continuous, differentiable, etc.) in the parameter, and we obtain uniform convergence results as the number of simulated interacting samples (particles) tends to infinity. We also study some aspects of statistical prediction theory: prediction models, $p$-sufficiency, optimality, efficiency, and rates of convergence. Applications to Poisson and Ornstein-Uhlenbeck processes are given.
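As an illustration of the kind of interacting Monte Carlo scheme referred to above, the following is a minimal sketch of a generic bootstrap particle filter (not the paper's specific scheme) for a hypothetical one-dimensional AR(1)-plus-noise model; the function name, the model, and all parameter choices are assumptions for the example. It returns particle approximations of the filtering means together with a Monte Carlo estimate of the log-likelihood, which can serve as a contrast function in the parameter $\theta$.

```python
import numpy as np

def bootstrap_particle_filter(y, theta, n_particles=500, seed=0):
    """Bootstrap particle filter for the illustrative model
        x_t = theta * x_{t-1} + v_t,   v_t ~ N(0, 1)
        y_t = x_t + w_t,               w_t ~ N(0, 1),
    returning filtering-mean estimates and an estimate of the
    log-likelihood of y under parameter theta."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)  # particles drawn from the prior
    log_lik = 0.0
    means = []
    for obs in y:
        # Mutation: propagate every particle through the state dynamics.
        x = theta * x + rng.standard_normal(n_particles)
        # Weighting: Gaussian observation density N(obs; x, 1), in log form.
        logw = -0.5 * (obs - x) ** 2
        m = logw.max()
        w = np.exp(logw - m)
        # Incremental log-likelihood estimate (restoring the N(0,1) constant).
        log_lik += m + np.log(w.mean()) - 0.5 * np.log(2.0 * np.pi)
        w /= w.sum()
        means.append(np.sum(w * x))  # weighted particle filtering mean
        # Selection (the "interaction" step): multinomial resampling.
        x = rng.choice(x, size=n_particles, p=w)
    return np.array(means), log_lik
```

Evaluating this estimated log-likelihood on a grid of $\theta$ values gives a simulated contrast function whose smoothness in the parameter is the kind of property studied in the paper.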