
📄 persistentpagefile.java

📁 An R-tree implemented in Java
💻 JAVA
package rtree;

import java.io.*;
import java.util.*;

/**
 * A page file that stores all pages in persistent storage. It uses a RandomAccessFile to hold node data.
 * The format of the page file is as follows. First, a header is written that stores important information
 * about the RTree. The header format is shown below:
 * <br>
 * &nbsp;&nbsp;int dimension<br>
 * &nbsp;&nbsp;float fillFactor<br>
 * &nbsp;&nbsp;int nodeCapacity<br>
 * &nbsp;&nbsp;int pageSize<br>
 * &nbsp;&nbsp;int treeType<br>
 * <p>
 * All the pages are stored after the header, with the following format:
 * <br>
 * &nbsp;&nbsp;int parent<br>
 * &nbsp;&nbsp;int level<br>
 * &nbsp;&nbsp;int usedSpace<br>
 * &nbsp;&nbsp;// HyperCubes<br>
 * &nbsp;&nbsp;for (i in usedSpace) {<br>
 * &nbsp;&nbsp;&nbsp;&nbsp;for (j in dimension) {<br>
 * &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;float p(i)1 [j]<br>
 * &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;float p(i)2 [j]<br>
 * &nbsp;&nbsp;&nbsp;&nbsp;}<br>
 * &nbsp;&nbsp;&nbsp;&nbsp;int branch<br>
 * &nbsp;&nbsp;}
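 * <p>
 * Note that writeNode below pads every page out to nodeCapacity entries (unused slots are
 * written as NaN coordinates), so each page image has a fixed size of
 * <br>
 * &nbsp;&nbsp;3 * 4 + nodeCapacity * (2 * dimension * 4 + 4)
 * <br>
 * bytes: three header ints, plus two dimension-sized float points and one branch int per entry.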
 * <p>
 * Deleted pages are stored in a Stack. When a new entry is inserted it is placed in the most recently
 * deleted page, so the page file does not grow forever.
 * <p>
 * Created: Sun Oct 24 21:14:57 1999
 * <p>
 * @author Hadjieleftheriou Marios
 * @version 1.001
 */
public class PersistentPageFile extends PageFile {
    /** Stores node data into Persistent storage. */
    private RandomAccessFile file;
    private String fileName;

    private Stack emptyPages = new Stack();

    /**
     * Header size in bytes: five 4-byte fields
     * (int dimension, float fillFactor, int nodeCapacity, int pageSize, int treeType).
     */
    private int headerSize = 20;

    public static final int EMPTY_PAGE = -2;

    public PersistentPageFile() {
	this(null);
    }

    public PersistentPageFile(String fileName) {
	try {
	    if (fileName == null) {
		File f = File.createTempFile("rtree", ".dat");
		this.fileName = f.getCanonicalPath();
		f.deleteOnExit();
		file = new RandomAccessFile(f, "rw");
	    } else {
		file = new RandomAccessFile(fileName, "rw");
		this.fileName = fileName;

		file.seek(0);
		byte[] header = new byte[headerSize];
		if (headerSize == file.read(header)) {
		    DataInputStream ds = new DataInputStream(new ByteArrayInputStream(header));
		    dimension = ds.readInt();
		    fillFactor = ds.readFloat();
		    nodeCapacity = ds.readInt();
		    pageSize = ds.readInt();
		    treeType = ds.readInt();

		    // find all empty pages and add them into the emptyPages stack.
		    int i = 0;
		    try {
			while (true) {
			    if (EMPTY_PAGE == file.readInt()) {
				emptyPages.push(new Integer(i));
			    }
			    i++;
			    file.seek(headerSize + i * pageSize);
			}
		    } catch (IOException e) {
			// EOF reached: the entire file has been scanned for empty pages.
		    }
		}
	    }
	} catch (Exception e) {
	    e.printStackTrace();
	}
    }

    protected void initialize(RTree tree, int dimension, float fillFactor, int capacity, int treeType) {
	super.initialize(tree, dimension, fillFactor, capacity, treeType);
	emptyPages.clear();
       
	try {
	    // FIXME: setLength needs Java 2 (JDK 1.2) or later to compile.
	    file.setLength(0);
	    file.seek(0);
	    file.writeInt(dimension);
	    file.writeFloat(fillFactor);
	    file.writeInt(nodeCapacity);
	    file.writeInt(pageSize);
	    file.writeInt(treeType);
	} catch (IOException e) {
	    e.printStackTrace();
	}
    }

    protected void finalize() throws Throwable {
	try {
	    file.close();
	} catch (Exception e) {
	    e.printStackTrace();
	}

	super.finalize();
    }

    /*
    protected byte[] readData(int page) throws PageFaultError {
	if (page < 0) {
	    throw new IllegalArgumentException("Page number cannot be negative.");
	}

	try {
	    file.seek(headerSize + page * pageSize);

	    byte[] b = new byte[pageSize];
	    if (-1 == file.read(b)) {
		throw new PageFaultError("EOF found while loading page " + page + ".");
	    }

	    return b;
	} catch (IOException e) {
	    e.printStackTrace();
	    return null;
	}
    }

    protected int writeData(byte[] d, int page) throws PageFaultError {
	if (d == null) {
	    throw new IllegalArgumentException("Data cannot be null.");
	}

	if (d.length > pageSize) {
	    throw new IllegalArgumentException("Byte array length must be at most equal to one page size.");
	}

	try {
	    if (page < 0) {
		if (emptyPages.empty()) {
		    page = (int) ((file.length() - headerSize) / pageSize);
		} else {
		    page = ((Integer) emptyPages.pop()).intValue();
		}
	    }

	    file.seek(headerSize + page * pageSize);
	    file.write(d);
	
	    return page;
	} catch (IOException e) {
	    e.printStackTrace();
	    return -1;
	}
    }
    */

    protected AbstractNode readNode(int page) throws PageFaultError {
	if (page < 0) {
	    throw new IllegalArgumentException("Page number cannot be negative.");
	}

	try {
	    file.seek(headerSize + page * pageSize);

	    byte[] b = new byte[pageSize];
	    int l = file.read(b);
	    if (-1 == l) {
		throw new PageFaultError("EOF found while trying to read page " + page + ".");
	    }

	    DataInputStream ds = new DataInputStream(new ByteArrayInputStream(b));
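	    // Page layout (see the class javadoc): int parent, int level, int usedSpace,
	    // followed by one entry per used slot: 2 * dimension floats and an int branch pointer.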

	    int parent = ds.readInt();
	    if (parent == EMPTY_PAGE) {
		throw new PageFaultError("Page " + page + " is empty.");
	    }

	    int level = ds.readInt();
	    int usedSpace = ds.readInt();

	    AbstractNode n;
	    if (level != 0) {
		n = new Index(tree, parent, page, level);
	    } else {
		n = new Leaf(tree, parent, page);
	    }

	    n.parent = parent;
	    n.level = level;
	    n.usedSpace = usedSpace;

	    float[] p1 = new float[dimension];
	    float[] p2 = new float[dimension];

	    for (int i = 0; i < usedSpace; i++) {
		for (int j = 0; j < dimension; j++) {
		    p1[j] = ds.readFloat();
		    p2[j] = ds.readFloat();
		}

		n.data[i] = new HyperCube(new Point(p1), new Point(p2));
		n.branches[i] = ds.readInt();
	    }

	    return n;
	} catch (IOException e) {
	    e.printStackTrace();
	    return null;
	}
    }

    protected int writeNode(AbstractNode n) throws PageFaultError {
	if (n == null) {
	    throw new IllegalArgumentException("Node cannot be null.");
	}

	try {
	    int page;
	    if (n.pageNumber < 0) {
		if (emptyPages.empty()) {
		    page = (int) ((file.length() - headerSize) / pageSize);
		} else {
		    page = ((Integer) emptyPages.pop()).intValue();
		}
		n.pageNumber = page;
	    } else {
		page = n.pageNumber;
	    }

	    ByteArrayOutputStream bs = new ByteArrayOutputStream(pageSize);
	    DataOutputStream ds = new DataOutputStream(bs);
	    ds.writeInt(n.parent);
	    ds.writeInt(n.level);
	    ds.writeInt(n.usedSpace);

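	    // Write all nodeCapacity slots (padding unused ones with NaN coordinates)
	    // so every page image occupies the same fixed number of bytes.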
	    for (int i = 0; i < tree.getNodeCapacity(); i++) {
		for (int j = 0; j < tree.getDimension(); j++) {
		    if (n.data[i] == null) {
			ds.writeFloat(Float.NaN);
			ds.writeFloat(Float.NaN);
		    } else {
			ds.writeFloat(n.data[i].getP1().getFloatCoordinate(j));
			ds.writeFloat(n.data[i].getP2().getFloatCoordinate(j));
		    }
		}
		ds.writeInt(n.branches[i]);
	    }
	    ds.flush();
	    bs.flush();

	    file.seek(headerSize + page * pageSize);
	    file.write(bs.toByteArray());

	    return page;
	} catch (IOException e) {
	    e.printStackTrace();
	    return -1;
	}
    }

    protected AbstractNode deletePage(int page) throws PageFaultError {
	try {
	    if (page < 0 || page > (file.length() - headerSize) / pageSize) {
		return null;
	    } else {
		AbstractNode n = readNode(page);
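		// Overwrite the page's first int (the parent field) with EMPTY_PAGE so the
		// constructor's startup scan can rediscover this free slot later.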
		file.seek(headerSize + page * pageSize);
		file.writeInt(EMPTY_PAGE);
		emptyPages.push(new Integer(page));
		return n;
	    }
	} catch (IOException e) {
	    e.printStackTrace();
	    return null;
	}
    }

} // PersistentPageFile
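
The sketch below is not part of the original source; it is a minimal, hypothetical HeaderDump utility showing how the 20-byte header documented in the class javadoc could be read back with plain java.io. The class name and the hard-coded field order are assumptions based on what initialize() writes above.

// Hypothetical helper, not part of the rtree package: dumps the page file header.
import java.io.*;

public class HeaderDump {
    public static void main(String[] args) throws IOException {
        RandomAccessFile file = new RandomAccessFile(args[0], "r");
        try {
            // The header is five 4-byte fields, as documented in PersistentPageFile.
            byte[] header = new byte[20];
            file.seek(0);
            if (header.length != file.read(header)) {
                throw new IOException("File too short to contain a page file header.");
            }
            DataInputStream ds = new DataInputStream(new ByteArrayInputStream(header));
            System.out.println("dimension    = " + ds.readInt());
            System.out.println("fillFactor   = " + ds.readFloat());
            System.out.println("nodeCapacity = " + ds.readInt());
            System.out.println("pageSize     = " + ds.readInt());
            System.out.println("treeType     = " + ds.readInt());
        } finally {
            file.close();
        }
    }
}

Run it with the path of an existing page file as its only argument.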
