Your mission is to read a text file called exactly "url.txt" that contains URLs, and make sure each of those URLs works. For each URL you should print either "Good", "Host down", or "Page Not Found". Your code MUST have a function called checkIt(), which takes a URL as its argument and returns either a 0, 1, or 2. Your main code should:

1) Open the URL file "url.txt".
2) For each line of the file, call the function and print the matching words.

---- Some helpful code ----

from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

try:
    req = Request("http://www.111cn.net/")
    response = urlopen(req)
    text = response.read().decode("UTF-8")
    print(text[:300])
except HTTPError as e:
    # Connected, but no data was transferred
    print("Page not found", e)
except URLError as e:
    # Could not connect
    print("Hostname bad", e)

--------------------------- Other useful code ---------------------------

from colorama import Fore, Back, Style

print(Fore.RED + 'some red text')

--------------------------- More other useful code ---------------------------

import time

print(time.time())

------------------------- Version 2 -------------------------

Write a program that reads a file called url.txt and produces output like this:

https://www.google.com - fast: 0.291 seconds.
https://timesofindia.com - slow: 1.028 seconds.
https://www.google.com/sjdfhksjfh - PAGE NOT FOUND!
https://www.google.commmmmm - SITE DOWN!

Notes:
A site counts as fast when it responds in less than 0.3 seconds.
The times are rounded to 3 digits.
Fast lines should be in green.
Slow lines should be in yellow.
Error lines should be in red.
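
------------------------- A possible sketch for Version 1 -------------------------

The sketch below combines the urllib snippet above with the required checkIt() function. The assignment does not say which return code maps to which message, so the mapping 0 = "Good", 1 = "Host down", 2 = "Page Not Found" is an assumption, as is the words dictionary used by the main code.

from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError


def checkIt(url):
    # Return 0 if the URL loads, 1 if the host is unreachable, 2 if the
    # page is missing.  (This 0/1/2 mapping is an assumption; the
    # assignment does not specify it.)
    try:
        urlopen(Request(url))   # we only care whether the request succeeds
        return 0                # Good
    except HTTPError:           # connected, but the server returned an error (e.g. 404)
        return 2                # Page Not Found
    except URLError:            # could not resolve or reach the host
        return 1                # Host down


def main():
    # Words matching the assumed return codes above.
    words = {0: "Good", 1: "Host down", 2: "Page Not Found"}
    with open("url.txt") as f:
        for line in f:
            url = line.strip()
            if url:             # skip blank lines
                print(url, "-", words[checkIt(url)])


if __name__ == "__main__":
    main()

Note that HTTPError is caught before URLError because HTTPError is a subclass of URLError; reversing the two except blocks would report every 404 as a down host.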
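
------------------------- A possible sketch for Version 2 -------------------------

A minimal sketch for Version 2, assuming the 0.3-second figure is the cutoff between fast and slow and that colorama is installed (pip install colorama). The helper name timeIt() is hypothetical; only checkIt() is required by the assignment.

import time
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

from colorama import Fore, init


def timeIt(url):
    # Hypothetical helper: return the elapsed seconds for a successful
    # request, or an error string for a failed one.
    start = time.time()
    try:
        urlopen(Request(url))
        return time.time() - start
    except HTTPError:
        return "PAGE NOT FOUND!"
    except URLError:
        return "SITE DOWN!"


def main():
    init(autoreset=True)        # reset the colour after every print
    with open("url.txt") as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            result = timeIt(url)
            if isinstance(result, str):    # error case
                print(Fore.RED + url + " - " + result)
            elif result < 0.3:             # assumed fast/slow cutoff
                print(Fore.GREEN + url + " - fast: " + str(round(result, 3)) + " seconds.")
            else:
                print(Fore.YELLOW + url + " - slow: " + str(round(result, 3)) + " seconds.")


if __name__ == "__main__":
    main()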