The advent of computational science has unveiled large classes of nonlinear optimization problems in which derivatives of the objective and/or constraints are unavailable. These problems are often posed as black-box optimization problems, but this is rarely a necessity. We report on our experience extracting additional structure from problems consisting of both black-box and algebraic or otherwise known components. We provide examples drawn from calibration problems, problems where a subset of derivatives is known, and nonsmooth composite optimization. In each case, we use quadratic surrogates to model both the black-box and algebraic components, obtaining new, globally convergent grey-box optimization methods.
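To illustrate the kind of structure exploitation described above, the following minimal sketch treats a calibration (least-squares) problem where the residual vector is a black box but the way the residuals combine into the objective is known algebraically. Rather than modeling the scalar sum of squares as a single black box, the sketch fits a quadratic surrogate to each residual by regression over local samples and minimizes the known sum-of-squares of those surrogates inside a crude box trust region. This is only an illustrative sketch under stated assumptions, not the method developed in the paper: the `residuals` function, the sample counts, and the simple accept/reject loop are all placeholders.

```python
# Minimal sketch (not the paper's algorithm): exploit least-squares structure by
# building quadratic surrogates of each black-box residual and minimizing the
# known algebraic combination (sum of squares) of the surrogates in a trust region.

import numpy as np
from scipy.optimize import minimize

def residuals(x):
    # Hypothetical black-box residual vector F(x); derivatives assumed unavailable.
    return np.array([x[0] ** 2 + np.sin(3 * x[1]) - 1.0,
                     np.exp(0.5 * x[0]) - x[1] - 0.5])

def quadratic_features(x, center):
    """Monomial basis (1, s, s_i s_j) of a quadratic in the shift s = x - center."""
    s = x - center
    quad = np.outer(s, s)[np.triu_indices(len(s))]
    return np.concatenate(([1.0], s, quad))

def fit_residual_surrogates(center, radius, n_samples=30, seed=0):
    """Regress one quadratic surrogate per residual from samples near `center`."""
    rng = np.random.default_rng(seed)
    X = center + radius * rng.uniform(-1.0, 1.0, size=(n_samples, len(center)))
    Phi = np.array([quadratic_features(x, center) for x in X])
    F = np.array([residuals(x) for x in X])            # one black-box call per sample
    coeffs, *_ = np.linalg.lstsq(Phi, F, rcond=None)   # column j models residual j
    return coeffs

def surrogate_objective(x, center, coeffs):
    """Known algebraic part applied to the surrogates: sum of squared model residuals."""
    m = quadratic_features(x, center) @ coeffs
    return float(m @ m)

x = np.array([0.5, 0.5])
radius = 0.5
for it in range(10):
    coeffs = fit_residual_surrogates(x, radius, seed=it)
    res = minimize(surrogate_objective, x, args=(x, coeffs),
                   bounds=[(xi - radius, xi + radius) for xi in x])  # box trust region
    f_old = residuals(x) @ residuals(x)
    f_new = residuals(res.x) @ residuals(res.x)
    if f_new < f_old:
        x, radius = res.x, min(2 * radius, 1.0)        # accept step, expand region
    else:
        radius *= 0.5                                   # reject step, shrink region
    print(f"iter {it}: f = {min(f_old, f_new):.3e}, radius = {radius:.2f}")
```

The design point the sketch is meant to convey is that the surrogates approximate only the unknown components (the residuals), while the known algebraic structure (the squaring and summation) is applied exactly; the same idea extends to the partially known-derivative and nonsmooth composite settings mentioned above.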