The behavior of animals in the wild is often studied through visual observations made by trained experts. However, such observations are typically limited to classifying behavior during discrete time periods and become impractical when multiple individuals must be monitored for days or weeks. In this work, we present automatic tools that enable efficient estimation and classification of behavior and dynamic state from data collected with animal-borne bio-logging tags, without the need for statistical feature engineering. A combined framework of a long short-term memory (LSTM) network and a hidden Markov model (HMM) was developed to exploit sequential temporal information in raw motion data at two levels: within and between windows. Taking a moving-window data segmentation approach, the LSTM estimates the dynamic state corresponding to each window by parsing the contiguous raw data points within the window. The HMM then links the individual window estimates and further improves the overall estimation. A case study with bottlenose dolphins was conducted to demonstrate the approach. The combined LSTM–HMM method achieved a 6% improvement over conventional methods such as k-nearest neighbors (KNN) and support vector machines (SVM), pushing the accuracy above 90%. In addition to this performance improvement, the proposed method requires an amount of training data similar to that of traditional machine learning methods, making it easily adaptable to new tasks.
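The between-window step described above can be illustrated with a minimal sketch: per-window class posteriors (standing in for LSTM softmax outputs, which are hypothetical values here, not the paper's data) are linked by Viterbi decoding under a transition matrix that favors state persistence, so an isolated misclassified window gets corrected. The state labels, probabilities, and transition values below are illustrative assumptions, not taken from the case study.

```python
import numpy as np

def viterbi(log_emissions, log_trans, log_prior):
    """Most likely state sequence given per-window log-posteriors.

    log_emissions: (T, S) log-probabilities per window (e.g. LSTM output)
    log_trans:     (S, S) log transition matrix between windows
    log_prior:     (S,)   log initial state distribution
    """
    T, S = log_emissions.shape
    dp = np.zeros((T, S))             # best log-score ending in each state
    back = np.zeros((T, S), dtype=int)  # backpointers for path recovery
    dp[0] = log_prior + log_emissions[0]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + log_trans  # (from_state, to_state)
        back[t] = scores.argmax(axis=0)
        dp[t] = scores.max(axis=0) + log_emissions[t]
    path = np.zeros(T, dtype=int)
    path[-1] = dp[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path

# Hypothetical per-window posteriors over two dynamic states;
# window 2 is an isolated low-confidence misclassification.
probs = np.array([
    [0.90, 0.10],
    [0.80, 0.20],
    [0.40, 0.60],   # noisy window
    [0.85, 0.15],
    [0.90, 0.10],
])
trans = np.array([[0.95, 0.05],   # assumed: states tend to persist
                  [0.05, 0.95]])
prior = np.array([0.5, 0.5])

raw = probs.argmax(axis=1)
smoothed = viterbi(np.log(probs), np.log(trans), np.log(prior))
print(raw.tolist())       # [0, 0, 1, 0, 0]  per-window argmax
print(smoothed.tolist())  # [0, 0, 0, 0, 0]  HMM-smoothed sequence
```

The smoothed path overrides the isolated window-2 flip because the persistence prior makes a brief state switch less likely than a momentarily noisy emission, which is the role the HMM plays on top of the window-level LSTM estimates.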