python - Solution to storing 300MB in memory for Google App Engine


I'm using Google App Engine in Python. I have 5000 people in my database, and the entire list of 5000 person objects takes 300 MB of memory.

I have been trying to store them in memory using blobcache, a module written [here][1].

I'm running into pickle "OutOfMemory" issues, and I'm looking for a solution that involves storing these 5000 objects in the database and retrieving them all at once.

My person model looks like this:

    class PersonDB(db.Model):
        serialized = db.BlobProperty()
        pid = db.StringProperty()

Each person object has many attributes and methods associated with it, so I decided to pickle each person object and store it in the serialized field. The pid allows me to query a person by id. A person looks like this:

    class Person():
        def __init__(self, sex, mrn, age):
            self.sex = sex
            self.age = age  # exact age
            self.record_number = mrn
            self.locations = []

        def makeAgeGroup(self, ageStr):
            ageG = ageStr
            return int(ageG)

        def addLocation(self, healthDistrict):
            self.locations.append(healthDistrict)

When I store all 5000 people at once to the database, I get a server 500 error. Does anyone know why? My code follows:

    # people is a list of 5000 person objects
    def write_people(self, people):
        for person in people:
            person_entity = PersonDB()
            person_entity.serialized = pickle.dumps(person)
            person_entity.pid = person.record_number
            person_entity.put()
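One likely contributor to the 500 error is that each `put()` is a separate datastore round trip, so 5000 of them can exceed the request deadline; the old `db` API's `db.put()` accepts a list of entities and saves them in far fewer RPCs. A minimal sketch of batching, assuming the `PersonDB` model above (the `chunks` helper is generic and the datastore calls are shown as comments since they only run inside App Engine):

```python
def chunks(items, size):
    """Yield successive slices of `items` with at most `size` elements each."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Hypothetical usage inside write_people, assuming the PersonDB model
# from the question and the old db API's batch put:
#
#   entities = []
#   for person in people:
#       entity = PersonDB(serialized=pickle.dumps(person),
#                         pid=person.record_number)
#       entities.append(entity)
#   for batch in chunks(entities, 100):
#       db.put(batch)  # one RPC per batch instead of one per entity
```

Batches of around 100 entities keep each RPC well under the datastore's per-call size limits while cutting the number of round trips by two orders of magnitude.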

How would I retrieve all 5000 of these objects at once in an App Engine method?

My idea is something like this:

    def get_patients(self):
        # get the list of all 5000 people from the database
        people_from_db = db.GqlQuery("SELECT * FROM PersonDB")
        people = []
        for person in people_from_db:
            people.append(pickle.loads(person.serialized))
        return people

Thanks in advance, I've been stuck on this for a while!

You should not have all 5000 users in memory at once. Retrieve only the one you need.

