Yttrium-90 (90Y) radioembolization is a minimally invasive procedure increasingly used to treat advanced liver cancer. In this method, radioactive microspheres are injected into the hepatic arterial bloodstream to target, irradiate, and kill cancer cells. Accurate and precise treatment planning enables more effective and safer treatment by delivering a higher radiation dose to the tumor while minimizing the exposure of the surrounding liver parenchyma. Treatment planning relies primarily on the estimated radiation dose delivered to tissue; however, current dose-estimation methods rest on simplified assumptions that make the dosimetry results unreliable. In this work, we present a computational model that predicts the radiation dose from the 90Y activity in individual liver segments, providing more realistic, personalized dosimetry. Computational fluid dynamics (CFD) simulations were performed in a 3D hepatic arterial tree model segmented from cone-beam CT angiographic data of a patient with hepatocellular carcinoma (HCC). Microsphere trajectories were predicted from the computed velocity field, and the 90Y dose distribution was then calculated from the volumetric distribution of the microspheres. Two injection locations were considered for microsphere administration: a lobar injection and a selective injection. The lobar and selective injections delivered 22% and 82% of the microspheres to the tumor, respectively, and the two injections combined ultimately delivered 49% of the total administered 90Y microspheres to the tumor. The results also illustrate the nonhomogeneous distribution of microspheres among liver segments, underscoring the importance of patient-specific dosimetry methods for effective radioembolization treatment.
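The final step described above, converting a per-segment microsphere distribution into absorbed dose, can be illustrated with a minimal sketch. It assumes the widely used MIRD mono-compartment relation for 90Y (complete decay with local energy absorption, i.e., dose in Gy = 49.67 × activity in GBq / tissue mass in kg); the segment delivery fractions and masses below are hypothetical placeholders for the values a CFD particle-tracking step would actually produce, and the code is not the paper's implementation.

```python
# Illustrative per-segment 90Y dosimetry sketch (hypothetical inputs).
# Assumes the MIRD mono-compartment relation: with complete decay and
# local energy deposition, Dose [Gy] = 49.67 * Activity [GBq] / mass [kg].

def segment_dose_gy(injected_activity_gbq: float,
                    delivery_fraction: float,
                    segment_mass_kg: float) -> float:
    """Absorbed dose in one liver segment from its share of 90Y microspheres."""
    return 49.67 * injected_activity_gbq * delivery_fraction / segment_mass_kg

if __name__ == "__main__":
    total_activity = 2.0  # GBq; hypothetical prescribed activity
    # (delivery fraction from particle tracking, segment mass in kg) -- made up
    segments = {
        "tumor":      (0.49, 0.15),
        "parenchyma": (0.51, 1.35),
    }
    for name, (frac, mass) in segments.items():
        print(f"{name}: {segment_dose_gy(total_activity, frac, mass):.1f} Gy")
```

A nonuniform microsphere distribution enters this calculation only through the per-segment delivery fractions, which is why the CFD-predicted trajectories, rather than a uniform-deposition assumption, drive the personalized dose estimate.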