Recent empirical research and special issues have focused on identifying appropriate metrics for ranking training programs, including faculty productivity and examination pass rates. Arguably, an important way to judge the quality of a training program is by talking with its graduates. Leaders in the field have recognized the critical importance of student voice in understanding program quality (Iwata, 2015). However, to date, no research has examined the training program or supervision experiences of behavior-analytic professionals. A mixed-methods survey was sent to over 1,200 behavior analysts across Pennsylvania to investigate a variety of professional issues, including questions about the strengths and needs of their training programs. Respondents (n = 98) identified strengths in program design or characteristics (e.g., providing supervision), quality faculty and instructors, and effective instructional activities. Common programmatic needs included real-life application of skills, specific content areas (e.g., verbal behavior), and program organization. Interestingly, reported strengths and needs differed by the programs’ method of instruction (on-campus, hybrid, or online). Results from this survey add an important missing voice to the field’s conversations regarding training program quality and can provide critically important information for those responsible for training and supervising the next generation of behavior analysts.