Getting the value of a specific attribute on a specific tag in an HTML page without a third-party framework

Most of the time, when all we need is the value of one attribute on one kind of tag in a page's HTML source, pulling in a third-party framework is overkill, like using a sledgehammer to crack a nut. In cases like this, a simple regular expression is enough to extract the data we want.
For example, suppose I want to download a TV series. Most sites only give us a long list of episode links that we would have to click one by one, which is tedious. Instead, we can fetch the page's HTML as a string over HTTP and then use a regular expression to pull out the href value of every a tag in one pass.
Taking the download page of a certain movie site as the example, the demo is as follows:
Utility method httpSendGet:

// Imports used by the snippets below
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public static String httpSendGet(String url, String param, String charsetName) {
    StringBuilder result = new StringBuilder();
    BufferedReader in = null;
    try {
        // Only append the query string when a parameter is actually supplied
        String urlNameString = (param == null || param.isEmpty()) ? url : url + "?" + param;
        URL realUrl = new URL(urlNameString);
        // Open a connection to the URL
        URLConnection connection = realUrl.openConnection();
        // Set the common request headers
        connection.setRequestProperty("accept", "*/*");
        connection.setRequestProperty("connection", "Keep-Alive");
        connection.setRequestProperty("user-agent",
                "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;SV1)");
        // Establish the actual connection
        connection.connect();
        // Read the response with a BufferedReader, decoding it with the given charset
        in = new BufferedReader(new InputStreamReader(
                connection.getInputStream(), charsetName));
        String line;
        while ((line = in.readLine()) != null) {
            result.append(line);
        }
    } catch (Exception e) {
        System.out.println("Exception while sending GET request! " + e);
        e.printStackTrace();
    } finally {
        // Close the input stream in a finally block
        try {
            if (in != null) {
                in.close();
            }
        } catch (Exception e2) {
            e2.printStackTrace();
        }
    }
    return result.toString();
}

Utility method match:

public static List<String> match(String source, String element, String attr) {
    List<String> result = new ArrayList<String>();
    // Match <element ... attr="value" ...> and capture the attribute value,
    // whether it is wrapped in double quotes, single quotes, or unquoted
    String reg = "<" + element + "[^<>]*?\\s" + attr + "=['\"]?(.*?)['\"]?(\\s.*?)?>";
    Matcher m = Pattern.compile(reg).matcher(source);
    while (m.find()) {
        result.add(m.group(1));
    }
    return result;
}
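
The same method works for any tag/attribute pair, not just a and href. For example (extracting the src of img tags here is purely an illustration, not something this demo does):

// Grab every image URL from the same page in exactly the same way
List<String> imageSources = match(html, "img", "src");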

Calling demo:

public static void main(String[] args) {
    String url = "https://www.dy2018.com/i/99671.html";
    String params = "";
    String html = httpSendGet(url, params, "gb2312");
    List<String> links = match(html, "a", "href");
    System.out.println(links);
}

One thing worth pointing out: pay attention to the charsetName parameter of httpSendGet; if it does not match the page's actual encoding, the HTML text you get back will be garbled.
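
If you would rather not hard-code the charset, one option is to read it from the Content-Type response header and fall back to a default. A minimal sketch (the helper name detectCharset and its fallback argument are my own, not part of the original post):

public static String detectCharset(URLConnection connection, String fallback) {
    // Content-Type usually looks like "text/html; charset=gb2312"
    String contentType = connection.getContentType();
    if (contentType != null) {
        for (String part : contentType.split(";")) {
            part = part.trim();
            if (part.toLowerCase().startsWith("charset=")) {
                return part.substring("charset=".length());
            }
        }
    }
    // Many older Chinese sites still need gb2312/GBK here
    return fallback;
}

Inside httpSendGet you could then pass detectCharset(connection, charsetName) to the InputStreamReader instead of the raw parameter.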
Finally, here is the output (it is still noisy, of course, and needs filtering):
[/, /2/, /0/, /3/, /1/, /4/, /8/, /5/, /7/, /14/, /15/, /html/tv/hytv/index.html, /html/tv/oumeitv/index.html, /html/tv/rihantv/index.html, /html/zongyi2013/index.html, /html/dongman/index.html, /support/GuestBook.php, #, index.html, /, /html/tv/, /html/tv/hytv/, javascript:window.external.addFavorite('http://www.dy2018.com/','dy2018.com-电影天堂')"class="style11, /webPlay/play-id-99671-collection-37.html, /webPlay/play-id-99671-collection-36.html, /webPlay/play-id-99671-collection-35.html, /webPlay/play-id-99671-collection-34.html, /webPlay/play-id-99671-collection-33.html, /webPlay/play-id-99671-collection-32.html, /webPlay/play-id-99671-collection-31.html, /webPlay/play-id-99671-collection-30.html, /webPlay/play-id-99671-collection-29.html, /webPlay/play-id-99671-collection-28.html, /webPlay/play-id-99671-collection-27.html, /webPlay/play-id-99671-collection-26.html, /webPlay/play-id-99671-collection-25.html, /webPlay/play-id-99671-collection-24.html, /webPlay/play-id-99671-collection-23.html, /webPlay/play-id-99671-collection-22.html, /webPlay/play-id-99671-collection-21.html, /webPlay/play-id-99671-collection-20.html, /webPlay/play-id-99671-collection-19.html, /webPlay/play-id-99671-collection-18.html, /webPlay/play-id-99671-collection-17.html, /webPlay/play-id-99671-collection-16.html, /webPlay/play-id-99671-collection-15.html, /webPlay/play-id-99671-collection-14.html, /webPlay/play-id-99671-collection-13.html, /webPlay/play-id-99671-collection-12.html, /webPlay/play-id-99671-collection-11.html, /webPlay/play-id-99671-collection-10.html, /webPlay/play-id-99671-collection-9.html, /webPlay/play-id-99671-collection-8.html, /webPlay/play-id-99671-collection-7.html, /webPlay/play-id-99671-collection-6.html, /webPlay/play-id-99671-collection-5.html, /webPlay/play-id-99671-collection-4.html, /webPlay/play-id-99671-collection-3.html, /webPlay/play-id-99671-collection-2.html, /webPlay/play-id-99671-collection-1.html, /webPlay/play-id-99671-collection-0.html, ftp://g:[email protected]:2166/一千零一夜35.mp4, ftp://g:[email protected]:2166/一千零一夜34.mp4, ftp://g:[email protected]:2166/一千零一夜33.mp4, ftp://g:[email protected]:2166/一千零一夜32.mp4, ftp://g:[email protected]:2166/一千零一夜31.mp4, ftp://g:[email protected]:2166/一千零一夜30.mp4, ftp://g:[email protected]:2166/一千零一夜29.mp4, ftp://g:[email protected]:2166/一千零一夜28.mp4, ftp://g:[email protected]:2166/一千零一夜27.mp4, ftp://g:[email protected]:2166/一千零一夜26.mp4, ftp://g:[email protected]:2166/一千零一夜25.mp4, ftp://g:[email protected]:2166/一千零一夜24.mp4, ftp://g:[email protected]:2166/一千零一夜23.mp4, ftp://g:[email protected]:2166/一千零一夜22.mp4, ftp://g:[email protected]:2166/一千零一夜21.mp4, ftp://g:[email protected]:2166/一千零一夜20.mp4, ftp://g:[email protected]:2166/一千零一夜19.mp4, ftp://g:[email protected]:2166/一千零一夜18.mp4, ftp://g:[email protected]:2166/一千零一夜17.mp4, ftp://g:[email protected]:2166/一千零一夜16.mp4, ftp://g:[email protected]:2166/一千零一夜15.mp4, ftp://g:[email protected]:2166/一千零一夜14.mp4, ftp://g:[email protected]:2166/一千零一夜13.mp4, ftp://g:[email protected]:2166/一千零一夜12.mp4, ftp://g:[email protected]:2166/一千零一夜11.mp4, ftp://g:[email protected]:2166/一千零一夜10.mp4, ftp://g:[email protected]:2166/一千零一夜09.mp4, ftp://g:[email protected]:2166/一千零一夜08.mp4, ftp://g:[email protected]:2166/一千零一夜07.mp4, ftp://g:[email protected]:2166/一千零一夜06.mp4, ftp://g:[email protected]:2166/一千零一夜05.mp4, ftp://g:[email protected]:2166/一千零一夜04.mp4, ftp://g:[email protected]:2166/一千零一夜03.mp4, ftp://g:[email protected]:2166/一千零一夜02.mp4, ftp://g:[email protected]:2166/一千零一夜01.mp4, /i/99743.html, 
/i/99734.html, /i/99733.html, /i/99725.html, /i/99720.html, /i/99719.html, /i/99716.html, /i/99708.html, /i/99704.html, /i/99695.html, /i/97129.html, /i/97575.html, /i/97041.html, /i/92091.html, /i/97637.html, /i/92020.html, /i/95187.html, /i/92000.html, /i/98343.html, /i/97363.html]
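
As noted, the raw list still needs filtering. One minimal way to keep only the direct download links (the ftp:// prefix check is specific to this page's output and is an assumption, not part of the original post):

List<String> downloadLinks = new ArrayList<String>();
for (String link : links) {
    // Everything that is not an ftp:// link is just site navigation here
    if (link.startsWith("ftp://")) {
        downloadLinks.add(link);
    }
}
System.out.println(downloadLinks);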

If you don't want to write the regular expression yourself, a third-party crawler framework will do the job; there are plenty to choose from online.
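
For instance, with Jsoup (assuming the jsoup dependency is on the classpath; this is only a rough sketch, not something the demo above relies on), the same extraction would look roughly like this:

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public static List<String> matchWithJsoup(String html) {
    List<String> result = new ArrayList<String>();
    // Parse the HTML properly and select every <a> that has an href attribute
    Document doc = Jsoup.parse(html);
    for (Element a : doc.select("a[href]")) {
        result.add(a.attr("href"));
    }
    return result;
}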

Reposted from blog.51cto.com/yuqian2203/2145120