About Newsweek USA
Newsweek is an American weekly news magazine founded in 1933.
Newsweek was a widely distributed newsweekly throughout the 20th century, led by many notable editors-in-chief over the years. The Washington Post Company acquired the magazine in 1961 and owned it until 2010.