Time complexity measures how an algorithm's running time grows as its input size increases. It is typically expressed in Big O notation, which describes the algorithm's asymptotic behavior by ignoring constant factors and lower-order terms, and it is a key tool for evaluating and comparing algorithmic efficiency.
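A minimal sketch of this idea, using two hypothetical helper functions that count comparisons: a linear scan performs O(n) comparisons in the worst case, while binary search on sorted data performs only O(log n).

```python
def linear_search_steps(sorted_list, target):
    """Scan every element; comparisons grow linearly with input size: O(n)."""
    steps = 0
    for value in sorted_list:
        steps += 1
        if value == target:
            return True, steps
    return False, steps

def binary_search_steps(sorted_list, target):
    """Halve the search range each step; comparisons grow as O(log n)."""
    lo, hi, steps = 0, len(sorted_list) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return True, steps
        elif sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False, steps

data = list(range(1_000_000))  # one million sorted integers
_, linear_steps = linear_search_steps(data, 999_999)   # worst case: element is last
_, binary_steps = binary_search_steps(data, 999_999)
print(linear_steps)  # 1,000,000 comparisons: scales with n
print(binary_steps)  # about 20 comparisons: scales with log2(n)
```

Doubling the input size roughly doubles the linear search's work but adds only one step to the binary search, which is exactly the scaling behavior Big O notation captures.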