小编典典

HttpConnection (javax.microedition) returns -1 from the getLength() method

java

I'm trying to write a very simple mobile application in Java (J2ME). The idea is to access a website via an entered URL and read the site's content into a buffer.

Here's the problem: this works perfectly for some URLs, but not for others. The example below (Wikipedia) works fine. But with "http://java.com/en/about/", for instance, getLength() on the HttpConnection hc returns -1, so there is no length to use when reading content into the buffer?

Here is my code:

        String url = "http://en.wikipedia.org/wiki/RSS";

        //Sets up HttpConnection and InputStream using the URL variable
        HttpConnection hc = null;
        InputStream is = null;

        try {
            hc = (HttpConnection) Connector.open(url);
            is = hc.openInputStream();
        } catch (IOException ie) {
            System.out.println(ie.getMessage());
        }

        //Reader object created to read input from InputStream
        Reader rdr = new InputStreamReader(is);

        //Variable "content" will store HTML code
        String content = "";

        //Get the length of the data to set the buffer sizes
        int len = (int) hc.getLength();

Any ideas? Let me know if I've missed anything!

FYI, I'm using NetBeans 6.9.1.

The imports used for HttpConnection are "import javax.microedition.io.HttpConnection;" and "import javax.microedition.io.Connector;".


2020-11-26

1 Answer

小编典典

The HTTP response from java.com is:

HTTP/1.1 200 OK
Server: Sun-Java-System-Web-Server/7.0
Date: Wed, 23 Feb 2011 11:07:44 GMT
Content-Type: text/html;charset=UTF-8
Set-Cookie: JSESSIONID=B62F3DFB233BB2806018EC721F6C3FD7; Path=/
Content-Encoding: gzip
Vary: accept-encoding
Transfer-Encoding: chunked

The HTTP response from Wikipedia is:

HTTP/1.0 200 OK
Date: Wed, 23 Feb 2011 10:18:56 GMT
Server: Apache
Cache-Control: private, s-maxage=0, max-age=0, must-revalidate
Content-Language: en
Vary: Accept-Encoding,Cookie
Last-Modified: Fri, 18 Feb 2011 00:23:59 GMT
Content-Encoding: gzip
Content-Length: 24905
Content-Type: text/html; charset=UTF-8
Age: 2984
X-Cache: HIT from sq61.wikimedia.org, MISS from sq38.wikimedia.org
X-Cache-Lookup: HIT from sq61.wikimedia.org:3128, MISS from sq38.wikimedia.org:80
Connection: keep-alive

As you can see, the HTTP response for http://java.com/en/about/ contains no Content-Length header, and the content is sent with Transfer-Encoding: chunked.

That is why getLength() returns -1.
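When the server sends a chunked response, the only reliable approach is to read the InputStream in a loop until end-of-stream, instead of pre-sizing a buffer from getLength(). A minimal sketch of that loop in standard Java (the same pattern applies to the InputStream returned by hc.openInputStream() in J2ME; the ByteArrayInputStream below is just a stand-in for the network stream so the sketch runs offline):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadAll {

    // Reads every byte from the stream until EOF. This works whether or
    // not the server sent a Content-Length header, so it also handles
    // chunked responses where getLength() would return -1.
    static String readAll(InputStream is) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int n;
        while ((n = is.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return new String(out.toByteArray(), "UTF-8");
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for hc.openInputStream(); a real app would pass the
        // stream obtained from the HttpConnection here.
        InputStream is = new ByteArrayInputStream(
                "<html>hello</html>".getBytes("UTF-8"));
        System.out.println(readAll(is));
    }
}
```

With this approach you never need getLength() at all; it can still be consulted as an optional hint for the initial buffer size when it is non-negative.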
