2025/5/1 2:46:53 Source: https://blog.csdn.net/2401_87849335/article/details/145653036
Exception Handling in Python Crawlers

In Python, exception handling is typically implemented with a try-except block. You can catch specific exception types or fall back to a generic Exception handler.

1. Catching Specific Exceptions

For common network-request and parsing errors, catch the concrete exception types, such as requests.exceptions.RequestException and AttributeError.

Example code:

```python
import requests
from bs4 import BeautifulSoup

def fetch_page(url):
    try:
        response = requests.get(url, timeout=10)  # set a request timeout
        response.raise_for_status()  # raise on HTTP error status codes
        return response.text
    except requests.exceptions.RequestException as e:
        print(f"Request failed: {e}")
    except Exception as e:
        print(f"Unexpected error: {e}")
    return None

def parse_page(html):
    if not html:
        return []
    try:
        soup = BeautifulSoup(html, 'html.parser')
        items = soup.find_all('div', class_='item')
        data = []
        for item in items:
            title = item.find('h2').text.strip()
            price = item.find('span', class_='price').text.strip()
            data.append({'title': title, 'price': price})
        return data
    except AttributeError as e:
        print(f"HTML parsing failed: {e}")
    return []

# Example usage
url = "https://example.com"
html = fetch_page(url)
if html:
    data = parse_page(html)
    print(data)
```

2. Logging Exceptions

In production, use the logging module to record exception details so that problems can be analyzed and traced later.

Example code:

```python
import logging
import requests

logging.basicConfig(level=logging.INFO,
                    format='%(asctime)s - %(levelname)s - %(message)s')

def fetch_page(url):
    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        return response.text
    except requests.exceptions.RequestException as e:
        logging.error(f"Request failed: {e}")
    except Exception as e:
        logging.error(f"Unexpected error: {e}")
    return None
```
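Beyond logging.error, the standard library's logging.exception records the full traceback along with the message, which is often exactly what you need when diagnosing a crawler failure. A minimal sketch (parse_price is a hypothetical helper, not part of the code above):

```python
import logging

logging.basicConfig(level=logging.INFO)

def parse_price(text):
    """Parse a price string like '$3.50' into a float, logging failures."""
    try:
        return float(text.replace("$", ""))
    except ValueError:
        # logging.exception logs at ERROR level and appends the traceback,
        # so the log shows exactly where the parse failed
        logging.exception("Could not parse price from %r", text)
        return None
```

Called inside an except block, logging.exception gives you the stack trace for free, whereas logging.error(f"...{e}") records only the message.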

3. Retry Mechanism

When a network request fails, a retry mechanism makes the crawler more robust.

Example code:

```python
import time
import logging
import requests
from requests.exceptions import RequestException

def fetch_page_with_retry(url, max_retries=3):
    retries = 0
    while retries < max_retries:
        try:
            response = requests.get(url, timeout=10)
            response.raise_for_status()
            return response.text
        except RequestException as e:
            retries += 1
            logging.warning(f"Request failed, retrying ({retries}/{max_retries}): {e}")
            time.sleep(2)  # wait 2 seconds before retrying
    logging.error("Maximum retries reached, giving up")
    return None
```
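As an alternative to a hand-rolled retry loop, requests can retry at the transport layer via urllib3's Retry class mounted on a Session. A minimal sketch, assuming the default urllib3 bundled with requests; the specific status codes and backoff factor are illustrative choices, not requirements:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_session(max_retries=3):
    """Build a Session that retries transient failures automatically."""
    retry = Retry(
        total=max_retries,
        backoff_factor=1,                       # exponential backoff between attempts
        status_forcelist=[500, 502, 503, 504],  # also retry on these status codes
    )
    adapter = HTTPAdapter(max_retries=retry)
    session = requests.Session()
    session.mount("http://", adapter)
    session.mount("https://", adapter)
    return session

session = make_session()
# session.get("https://example.com", timeout=10) would now retry on its own
```

This keeps retry policy in one place instead of repeating the loop in every fetch function, though a manual loop gives finer control over per-attempt logging.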

Exception Handling in Java Crawlers

In Java, exception handling is typically implemented with a try-catch block. You can catch specific exception types, such as IOException and ParseException.

1. Catching Specific Exceptions

For common network-request and parsing errors, catch the concrete exception types.

Example code:

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Scanner;

public class WebScraper {
    public static String fetchPage(String url) {
        try {
            HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
            connection.setRequestMethod("GET");
            connection.setConnectTimeout(10000); // connect timeout
            connection.setReadTimeout(10000);    // read timeout
            int responseCode = connection.getResponseCode();
            if (responseCode == HttpURLConnection.HTTP_OK) {
                Scanner scanner = new Scanner(connection.getInputStream());
                StringBuilder response = new StringBuilder();
                while (scanner.hasNext()) {
                    response.append(scanner.nextLine());
                }
                scanner.close();
                return response.toString();
            } else {
                System.out.println("Request failed, status code: " + responseCode);
            }
        } catch (IOException e) {
            System.err.println("Request failed: " + e.getMessage());
        } catch (Exception e) {
            System.err.println("Unexpected error: " + e.getMessage());
        }
        return null;
    }

    public static void main(String[] args) {
        String url = "https://example.com";
        String html = fetchPage(url);
        if (html != null) {
            System.out.println(html);
        }
    }
}
```

2. Logging Exceptions

In production, use a logging framework (such as Log4j or SLF4J) to record exception details.

Example code:

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Scanner;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class WebScraper {
    private static final Logger logger = LogManager.getLogger(WebScraper.class);

    public static String fetchPage(String url) {
        try {
            HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
            connection.setRequestMethod("GET");
            connection.setConnectTimeout(10000);
            connection.setReadTimeout(10000);
            int responseCode = connection.getResponseCode();
            if (responseCode == HttpURLConnection.HTTP_OK) {
                Scanner scanner = new Scanner(connection.getInputStream());
                StringBuilder response = new StringBuilder();
                while (scanner.hasNext()) {
                    response.append(scanner.nextLine());
                }
                scanner.close();
                return response.toString();
            } else {
                logger.error("Request failed, status code: " + responseCode);
            }
        } catch (IOException e) {
            logger.error("Request failed: " + e.getMessage());
        } catch (Exception e) {
            logger.error("Unexpected error: " + e.getMessage());
        }
        return null;
    }

    public static void main(String[] args) {
        String url = "https://example.com";
        String html = fetchPage(url);
        if (html != null) {
            logger.info(html);
        }
    }
}
```

3. Retry Mechanism

When a network request fails, a retry mechanism makes the crawler more robust.

Example code (note that a non-OK status code must also count as a failed attempt, otherwise the loop never terminates):

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Scanner;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class WebScraper {
    private static final Logger logger = LogManager.getLogger(WebScraper.class);

    public static String fetchPageWithRetry(String url, int maxRetries) {
        int retries = 0;
        while (retries < maxRetries) {
            try {
                HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
                connection.setRequestMethod("GET");
                connection.setConnectTimeout(10000);
                connection.setReadTimeout(10000);
                int responseCode = connection.getResponseCode();
                if (responseCode == HttpURLConnection.HTTP_OK) {
                    Scanner scanner = new Scanner(connection.getInputStream());
                    StringBuilder response = new StringBuilder();
                    while (scanner.hasNext()) {
                        response.append(scanner.nextLine());
                    }
                    scanner.close();
                    return response.toString();
                } else {
                    retries++; // count non-OK responses as failed attempts
                    logger.warn("Request failed, status code: " + responseCode);
                }
            } catch (IOException e) {
                retries++;
                logger.warn("Request failed, retrying (" + retries + "/" + maxRetries + "): " + e.getMessage());
                try {
                    Thread.sleep(2000); // wait 2 seconds before retrying
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                }
            } catch (Exception e) {
                logger.error("Unexpected error: " + e.getMessage());
                break;
            }
        }
        logger.error("Maximum retries reached, giving up");
        return null;
    }

    public static void main(String[] args) {
        String url = "https://example.com";
        String html = fetchPageWithRetry(url, 3);
        if (html != null) {
            logger.info(html);
        }
    }
}
```

Summary

Well-designed exception handling makes a crawler markedly more stable and reliable. The main strategies are:

  1. Use try-catch (try-except in Python) to catch exceptions.

  2. Log exception details.

  3. Use a retry mechanism for network errors.

  4. Handle different exception types separately.

  5. Clean up resources in finally blocks.
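Point 5 can be sketched in Python with try/finally, or more idiomatically with a with statement; read_first_line is a hypothetical helper used only to illustrate the pattern:

```python
def read_first_line(path):
    """Read one line, closing the file even if reading raises."""
    f = open(path, "r", encoding="utf-8")
    try:
        return f.readline().strip()
    finally:
        f.close()  # runs whether or not readline() raised

def read_first_line_with(path):
    """The idiomatic equivalent: the context manager closes the file."""
    with open(path, "r", encoding="utf-8") as f:
        return f.readline().strip()
```

In Java, the analogous idiom is a finally block or, preferably, try-with-resources, which closes the connection or stream automatically.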

In practice, adjust these strategies to the crawler's requirements and the characteristics of the target site, so that the crawler keeps running reliably in complex environments.
