These are chat archives for beniz/deepdetect

12th
May 2017
Ravi Kiran
@raaka1
May 12 2017 04:44
@roysG try tar -xvf xx.tar
roysG
@roysG
May 12 2017 05:50
Ok thanks.
I am trying to use a caffemodel taken from this site,
but I get an error that something is wrong with the "nclasses".
Please assist me, thanks
Ravi Kiran
@raaka1
May 12 2017 07:32
@roysG can you post the error in a gist and link it here?
roysG
@roysG
May 12 2017 10:19

First I add the service:

curl -X PUT "http://localhost:8080/services/getAge" -d '{
  "mllib":"caffe",
  "description":"clothes classification 1 age",
  "type":"supervised",
  "parameters":{
    "input":{
      "connector":"image"
    },
    "mllib":{
      "nclasses":101
    }
  },
  "model":{
    "repository":"/home/xxx/models/ages/imdbAge"
  }
}'

It got created, and then I made the POST request:

curl -X POST "http://localhost:8080/predict" -d '{
  "service":"getAge",
  "parameters":{
    "output":{
      "best":5
    }
  },
  "data":["https://scontent-frx5-1.cdninstagram.com/t51.2885-19/s600x600/17818623_622530944596656_5608228133154062336_a.jpg"]
}'

result in the client:
{"status":{"code":400,"msg":"BadRequest","dd_code":1006,"dd_msg":"Service Bad Request Error"}}

result in the server:
E0512 10:16:12.825335 16408 caffelib.cc:1041] Error creating model for prediction

ERROR - 10:16:12 - service cc mllib bad param: no deploy file in /home/xxx/models/ages/imdbAge for initializing the net

ERROR - 10:16:12 - Fri May 12 10:16:12 2017 UTC - 84.108.1.81 "POST /predict" 400 1

What am I missing?
Oh sorry, by accident I pasted the whole bug report here, I will move it to a gist
roysG
@roysG
May 12 2017 10:24
I created a bug on GitHub, link:
beniz/deepdetect#308
Emmanuel Benazera
@beniz
May 12 2017 10:32
it's not a bug @roysG, just read what the server tells you (missing deploy file), read the FAQ etc...
Emmanuel Benazera
@beniz
May 12 2017 11:19
look carefully at the provided models, and just repeat the same structure of files.
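The structure being referred to can be sketched like this (the file names are assumptions modeled on DeepDetect's stock Caffe example models; `deploy.prototxt` is the name the error message above says the server looks for):

```shell
# Sketch of a minimal Caffe model repository for DeepDetect.
# File names are assumptions based on the stock example models.
REPO=$(mktemp -d)
touch "$REPO/deploy.prototxt"    # network definition used at prediction time (required)
touch "$REPO/model.caffemodel"   # trained weights
touch "$REPO/corresp.txt"        # optional: class index -> label mapping
ls "$REPO"
```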
roysG
@roysG
May 12 2017 14:33

Ok, the deploy file was there but under a different name, now I changed it to deploy.
When I tried to create a new service (this time with the deploy file) I got an error:

INFO - 14:31:14 - Initializing net from parameters:

E0512 14:31:14.464174 16407 caffelib.cc:394] Error creating network
ERROR - 14:31:14 - Unknown bottom blob 'data' (layer 'conv1_1', bottom index 0)
ERROR - 14:31:14 - service creation call failed

ERROR - 14:31:14 - Fri May 12 14:31:14 2017 UTC - 84.108.1.81 "PUT /services/c22" 500 117


and in the client I got this error:

{"status":{"code":500,"msg":"InternalError","dd_code":1007,"dd_msg":"./include/caffe/llogging.h:153 / Fatal Caffe error"}}

Emmanuel Benazera
@beniz
May 12 2017 14:36
read the FAQ regarding model import and look at the template deploy.prototxt in the repository to fix yours.
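The "Unknown bottom blob 'data'" error above usually means the deploy file never declares its input. A minimal sketch of the input declaration a Caffe deploy.prototxt typically starts with (the name and shape values here are assumptions for a VGG-style net, not taken from this model):

```
name: "VGG_ILSVRC_16_layers"
input: "data"
input_shape {
  dim: 1    # batch size
  dim: 3    # channels
  dim: 224  # height
  dim: 224  # width
}
```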
roysG
@roysG
May 12 2017 14:38
Ok, can it be found on the site or on GitHub?
roysG
@roysG
May 12 2017 14:46
I am new to the system and want to learn it. If you can assist me and give links where I can read about the differences, it would be very kind of you, thanks
roysG
@roysG
May 12 2017 15:01

@beniz, I looked at how the working deploy file ("VGG_ILSVRC_16_layers") is set up and replaced the contents of mine.

I tried to create the service again and now it was created, but when I send the POST this is the result I get:

{"status":{"code":200,"msg":"OK"},"head":{"method":"/predict","service":"t2","time":61320.0},"body":{"predictions":[{"uri":"https://scontent-frx5-1.cdninstagram.com/t51.2885-19/s600x600/17818623_622530944596656_5608228133154062336_a.jpg","classes":[{"prob":0.0010000000474974514,"cat":"0"},{"prob":0.0010000000474974514,"cat":"1"},{"prob":0.0010000000474974514,"cat":"2"},{"prob":0.0010000000474974514,"cat":"3"},{"prob":0.0010000000474974514,"last":true,"cat":"4"}]}]}}

Am I missing something?

Emmanuel Benazera
@beniz
May 12 2017 15:41
read your imdb model doc carefully; if cat is the age, it probably has many more categories, which you can get values for with the best:-1 parameter
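Sketched out, the suggestion is the same /predict call as above but with "best":-1, which tells DeepDetect to return every category rather than the top 5 (the payload is built in a variable here only so it can be sanity-checked before sending):

```shell
# Same /predict call as earlier in the thread, with "best":-1 to get
# all categories back instead of the top 5.
PAYLOAD='{
  "service":"getAge",
  "parameters":{
    "output":{
      "best":-1
    }
  },
  "data":["https://scontent-frx5-1.cdninstagram.com/t51.2885-19/s600x600/17818623_622530944596656_5608228133154062336_a.jpg"]
}'
# Sanity-check the JSON before sending it:
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload ok"
# curl -X POST "http://localhost:8080/predict" -d "$PAYLOAD"
```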
roysG
@roysG
May 12 2017 16:21
How can I make the server start automatically on a server reboot?
roysG
@roysG
May 12 2017 16:29
And also, how do I keep the models I created even when I stop the server?
roysG
@roysG
May 12 2017 17:10

I tried to send the request from Node.js, but I get 400, Bad Request.

Important to mention that with curl everything works great.

The code:

needle.post('http://localhost:8080/predict', {
    "service":"getAge",
    "parameters":{
        "output":{
            "best":5
        }
    },
    "data":[
        "https://scontent-frx5-1.cdninstagram.com/t51.2885-19/11856752_515409661959330_1395087449_a.jpg"
    ]
}, function(err, data) {
    console.log(err, data)
})
roysG
@roysG
May 12 2017 17:33
?
Emmanuel Benazera
@beniz
May 12 2017 19:29
check your http headers.
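A likely cause, stated here as an assumption: needle form-encodes the request body unless told otherwise, so the server never receives JSON and answers 400; with needle this is usually fixed by passing `{ json: true }` as the options argument, which serializes the body as JSON and sets the matching header. The requirement expressed as an explicit curl call:

```shell
# What the server expects: a JSON body with a matching Content-Type
# header. A client that form-encodes the object instead will get a 400.
PAYLOAD='{"service":"getAge","parameters":{"output":{"best":5}},"data":["https://scontent-frx5-1.cdninstagram.com/t51.2885-19/11856752_515409661959330_1395087449_a.jpg"]}'
echo "$PAYLOAD" | python3 -c 'import json,sys; json.load(sys.stdin)' && echo "payload ok"
# curl -X POST "http://localhost:8080/predict" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```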
roysG
@roysG
May 12 2017 22:28
How can I speed up the response time of the results?
This is what I get in the http headers:
headers: { 'content-length': '42', 'content-type': 'application/json' },