Powered lower-limb prostheses rely on a high-level intelligent control system, referred to as locomotion mode recognition (LMR), which enables seamless amputee-prosthesis interaction by activating appropriate low-level controllers according to the user’s gait intent and environment. Environmental and terrain conditions provide valuable subject-independent prior information about upcoming locomotion modes, which enables the design of seamless, non-delayed LMR systems. The objective of this paper is to validate the feasibility of deep convolutional neural networks (CNNs) for distinguishing three environmental conditions: level walking, stair ascent, and stair descent. The CNN automates the feature learning and extraction that traditional models required to be hand-engineered. We construct an efficient CNN through transfer learning from a pre-trained model, with input images captured from seven able-bodied subjects during various indoor and outdoor daily-life walking tasks. A standstill detection algorithm based on an inertial measurement unit (IMU) is developed to automate image capture. To further enhance prediction performance, we incorporate the history of previously predicted environmental conditions and categorical information about environment properties (e.g., the number of steps in a staircase). The proposed environment recognition system achieves an overall accuracy of about 99% on the test data. The results verify the potential of CNNs to accurately predict environmental conditions, whether used individually or in combination with other sensors, toward an accurate and robust LMR system for lower-limb amputees with powered prostheses.
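The abstract mentions an IMU-based standstill detector that triggers image capture. The paper's actual algorithm and thresholds are not given here, so the following is only a minimal sketch of one common approach: declare a standstill when, over a short window, the accelerometer variance and the mean gyroscope magnitude both fall below thresholds. The function name and threshold values are assumptions, not taken from the paper.

```python
import numpy as np

def detect_standstill(accel, gyro, accel_var_thresh=0.05, gyro_mag_thresh=0.1):
    """Sketch of a windowed standstill test (thresholds are assumed values).

    accel: (N, 3) array of accelerometer samples in m/s^2
    gyro:  (N, 3) array of gyroscope samples in rad/s
    Returns True when the window looks stationary.
    """
    accel = np.asarray(accel, dtype=float)
    gyro = np.asarray(gyro, dtype=float)
    # Total accelerometer variance across the three axes over the window.
    accel_var = accel.var(axis=0).sum()
    # Mean angular-rate magnitude over the window.
    gyro_mag = np.linalg.norm(gyro, axis=1).mean()
    return bool(accel_var < accel_var_thresh and gyro_mag < gyro_mag_thresh)
```

In practice the window length and thresholds would be tuned per sensor; a stationary window (gravity-only acceleration, near-zero angular rates) returns True, while a walking window with large acceleration spread returns False.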
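The abstract also states that the history of previously predicted environmental conditions is used to enhance prediction performance. The paper's exact fusion rule is not given here; a simple illustrative stand-in is a majority vote over the last k frame-level CNN decisions, which suppresses isolated misclassifications. The class name, window size, and label strings below are all hypothetical.

```python
from collections import Counter, deque

class HistorySmoother:
    """Sketch: smooth per-frame predictions with a majority vote over the
    last k decisions (k is an assumed parameter, not from the paper)."""

    def __init__(self, k=5):
        # deque with maxlen keeps only the k most recent labels.
        self.history = deque(maxlen=k)

    def update(self, label):
        """Add the latest CNN prediction and return the smoothed decision."""
        self.history.append(label)
        # most_common(1) yields [(label, count)] for the majority label.
        return Counter(self.history).most_common(1)[0][0]
```

A single spurious "stair_ascent" frame inside a run of "level" frames is outvoted until the new mode actually persists, at the cost of a small decision delay of up to k frames.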