These are chat archives for beniz/deepdetect

Feb 2nd, 2016
Dawid Wolski
@merito
Feb 02 2016 15:25
Hi, I'm trying to run an example from the image classifier tutorial. I've created a service and received an answer from the server
INFO - source=../../templates/caffe/googlenet/
INFO - dest=../../models/buildings/imgnet/googlenet.prototxt
and the client
{"status":{"code":201,"msg":"Created"}}
but then, when trying to predict an image, the server prints this
and returns
$ curl -X POST "http://localhost:8080/predict" -d "{\"service\":\"test2\",\"parameters\":{\"input\":{\"width\":224,\"height\":224},\"output\":{\"best\":3}},\"data\":[\"ambulance.jpg\"]}"
{"status":{"code":200,"msg":"OK"},"head":{"method":"/predict","time":3519.0,"service":"test2"},"body":{"predictions":{"uri":"ambulance.jpg","classes":{"last":true,"prob":0.00009999999747378752,"cat":"n01440764 tench, Tinca tinca"}}}}
Sorry, the first answer from the server is
$ curl -X POST "http://localhost:8080/predict" -d "{\"service\":\"test2\",\"parameters\":{\"input\":{\"width\":224,\"height\":224},\"output\":{\"best\":3}},\"data\":[\"ambulance.jpg\"]}"
{"status":{"code":500,"msg":"InternalError","dd_code":1007,"dd_msg":"/home/devadmin/deepdetect/deepdetect/build/caffe_dd/src/caffe_dd/include/caffe/llogging.h:66 / Fatal Caffe error"}}
but every subsequent answer looks like the one in my first message. It always returns the class with id 0 from corresp.txt. I've tried the imgnet, buildings and furnitures models, with the same result.
Emmanuel Benazera
@beniz
Feb 02 2016 15:54
weird
what configuration are you on, and what are your memory / GPU specs?
I'll look at it again in 30 mins, must go for now
Emmanuel Benazera
@beniz
Feb 02 2016 16:26
@merito I believe you are mixing the weights and the model definition (deploy.prototxt) from two different models
Emmanuel Benazera
@beniz
Feb 02 2016 17:09
you can actually find out easily by listing the contents of your model repository ../../models/buildings/imgnet
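(As a rough illustration of that check, assuming the repository was built from the googlenet template as in the log above; the file names below are only illustrative, but the prototxt definition, the single .caffemodel weights file and corresp.txt should all belong to the same model:)
$ ls ../../models/buildings/imgnet
corresp.txt  deploy.prototxt  googlenet.prototxt  bvlc_googlenet.caffemodel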