tsichevski

Members · Content count: 41 · Community Reputation: 0 Neutral · Rank: Advanced Member · 2,184 profile views

  1. > We think we fixed the problem with access to attachments. Could you try it? Yes, thank you, now I can download the patch.
  2. Hi perstmco, when trying to access ignore.patch I got this error page: "Sorry, there is a problem. The page you are trying to access is not available for your account. Error code: 2C171/1"
  3. Hi, I use the Java version of Perst. How would redefining the classloader solve my problem? Regards, Vladimir
  4. I have a class with persistent instances and I need to rename or remove it. I removed all of its persistent instances, but my application does not start: it tries to load by name the class registered with the now-obsolete ClassDescriptor. Is it possible to somehow tolerate the situation where that class can no longer be loaded? Regards, Vladimir
  5. Hi, any news on a new release? Thanks, Vladimir
  6. Great! Thanks!
  7. When do you plan to release this new version of Perst? Will it be announced here? Regards, Vladimir
  8. Thank you! Do you plan to include this patch in the next release of Perst? Regards, Vladimir
  9. I do not organize my own collections. The example I sent just demonstrates how to break the program with a relatively short linked list. In my 'real' application I manage a few tree-like structures with multiple mutual links between nodes in different trees. Any node may have an arbitrary (though usually small, 0 or 1) number of links to nodes in other trees; the links are implemented as Set objects. In this model I can easily imagine Perst traveling a long distance through the links between the trees, and I can currently do nothing about it. I think it would be possible to use a Java collection to compute the set of reachable objects instead of using recursion, which relies on the JVM stack (a sketch of such an iterative traversal follows the post list below).
  10. Here is a simple example: a linked list. The link class:

      package testperst;

      import org.eclipse.jdt.annotation.NonNull;
      import org.eclipse.jdt.annotation.Nullable;
      import org.garret.perst.PinnedPersistent;
      import org.garret.perst.Storage;

      public class Link extends PinnedPersistent {
          public Link(@NonNull Storage storage, @Nullable Link next) {
              super(storage);
              this.next = next;
          }

          public Link getNext() {
              return next;
          }

          private final @Nullable Link next;
      }

      and the test method resulting in the StackOverflowError:

      @Test
      public void testLongLinkedList() throws Exception {
          File file = new File(TEST_DBS);
          file.delete();
          Storage db = getStorage();
          assert db != null;
          // build a chain of 2000 links, each pointing to the previous one
          Link link = new Link(db, null);
          for (int i = 0; i < 2000; i++) {
              link = new Link(db, link);
          }
          // make the head of the chain the storage root and commit on close
          db.setRoot(link);
          db.close();
      }

      The error appears if the number of links is more than 1200. I do not adjust the stack size.
  11. I'm afraid I could not extract the 'model' part and the loading code from the real application. Perhaps I can devise a simple test example.
  12. Is it possible to implement the Set on top of PersistentHashImpl using only the keys and ignoring the values? That is exactly how java.util.HashSet is implemented, and it should not require fetching objects during the search (see the map-backed set sketch after this list).
  13. IMHO this is not a very good solution. One of the most attractive Perst features is that the application does not have to know in advance whether an object will be persistent or not until the transaction is committed. It also should not (and cannot) know how big the transaction will be. Besides, splitting a transaction breaks the basic transaction guarantee: either all changes are applied, or none at all.
  14. No, what I need is quite the opposite. In my application model I use sets of objects which are supposed to be compared by their natural keys, not by OID. So two objects are EQUAL if their key parts are equal, regardless of whether these objects are persistent or not. In pseudo-code it looks like the following (a compilable version of this equality pattern follows the post list below):

      class MyObject {
          OtherObject keyPart1;
          String keyPart2;

          equals(MyObject other) {
              return other.keyPart1.equals(keyPart1) && other.keyPart2.equals(keyPart2);
          }

          hashCode() {
              return keyPart1.hashCode() * 31 + keyPart2.hashCode();
          }
      }

      Set persistentSet = Storage.createTruePersistentHashSet(storage);

      OtherObject keyPart1 = new OtherObject(...);
      String keyPart2 = "someString";

      persistentSet.add(new MyObject(keyPart1, keyPart2));

      // Must return TRUE, must NOT make the new MyObject persistent
      persistentSet.contains(new MyObject(keyPart1, keyPart2));
  15. Hi, maybe I did not make myself clear. The problem appears when I am creating the database, i.e. when I am initially storing too many freshly created in-memory objects to the DB file, not when I am loading data from the DB file back into memory. And yes, the data model I use matches your description, so I had no problem reading data. Maybe a different solution could be used for traversing objects: store the objects waiting for serialization in a Java collection instead of on the Java VM stack? Regards, Vladimir
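
The following is a minimal sketch of the iterative traversal suggested in post 9: computing the set of reachable objects with an explicit work queue instead of recursion, so that the depth of the object graph is limited by heap size rather than by the JVM stack. The Node type and its getLinks() accessor are hypothetical stand-ins for the application's persistent classes, not Perst API.

    import java.util.ArrayDeque;
    import java.util.Collections;
    import java.util.Deque;
    import java.util.IdentityHashMap;
    import java.util.Set;

    // Hypothetical persistent node type with links to nodes in other trees.
    interface Node {
        Iterable<Node> getLinks();
    }

    final class ReachabilityScan {
        // Collects every object reachable from 'root' without recursion:
        // an explicit deque replaces the call stack.
        static Set<Node> reachableFrom(Node root) {
            Set<Node> visited = Collections.newSetFromMap(new IdentityHashMap<>());
            Deque<Node> pending = new ArrayDeque<>();
            pending.push(root);
            while (!pending.isEmpty()) {
                Node current = pending.pop();
                if (!visited.add(current)) {
                    continue; // already scanned; also guards against cycles
                }
                for (Node next : current.getLinks()) {
                    if (next != null) {
                        pending.push(next);
                    }
                }
            }
            return visited;
        }
    }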
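
As a follow-up to post 12, here is a minimal sketch of the java.util.HashSet approach mentioned there: a set implemented on top of a map by storing the elements as keys together with a shared marker value. Whether this can be layered directly on Perst's PersistentHashImpl is an assumption; the sketch only shows the general pattern over any Map implementation, so that contains() needs nothing but a key lookup.

    import java.util.AbstractSet;
    import java.util.Iterator;
    import java.util.Map;

    // A set view over any Map, storing elements as keys with a shared
    // marker value, the same trick java.util.HashSet plays over HashMap.
    final class MapBackedSet<E> extends AbstractSet<E> {
        private static final Object PRESENT = new Object();
        private final Map<E, Object> map;

        MapBackedSet(Map<E, Object> backingMap) {
            this.map = backingMap;
        }

        @Override public boolean add(E e)           { return map.put(e, PRESENT) == null; }
        @Override public boolean remove(Object o)   { return map.remove(o) != null; }
        @Override public boolean contains(Object o) { return map.containsKey(o); }
        @Override public Iterator<E> iterator()     { return map.keySet().iterator(); }
        @Override public int size()                 { return map.size(); }
    }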
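
Finally, a compilable version of the key-based equality pattern sketched in pseudo-code in post 14, using the standard java.util.Objects helpers. MyObject and OtherObject mirror the names from that post; whether a Perst set would consult equals()/hashCode() rather than OIDs is exactly the open question raised there.

    import java.util.Objects;

    // Hypothetical key component; it would define equals()/hashCode()
    // over its own fields in the same way.
    class OtherObject {
    }

    // Equality and hashing are derived from the natural key parts only,
    // so two instances with equal keys compare equal regardless of
    // whether either of them has been made persistent.
    class MyObject {
        private final OtherObject keyPart1;
        private final String keyPart2;

        MyObject(OtherObject keyPart1, String keyPart2) {
            this.keyPart1 = keyPart1;
            this.keyPart2 = keyPart2;
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof MyObject)) return false;
            MyObject other = (MyObject) o;
            return Objects.equals(keyPart1, other.keyPart1)
                && Objects.equals(keyPart2, other.keyPart2);
        }

        @Override
        public int hashCode() {
            return Objects.hash(keyPart1, keyPart2);
        }
    }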