Data upload and dataset creation

I tried to upload a 40 GB .zip of images using the upload feature, but after the upload finished the application failed to unpack the zip and deleted the data. I then uploaded the zip via FTP to the uploads folder and tried to create a dataset, but the application froze. There is no way to cancel the operation, so I had to reboot.

I then tried to create the dataset with separate validation and testing data. The application processed only the testing data and then reported the dataset as complete; the training and validation data did not appear to be included.

Is there any other way to input very large datasets into the application?


Hi Stephen,

Can you please make sure there is enough disk space on the server (where the application runs) to complete the operation, and let us know?
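A quick way to check is with `df` on the server; "/" below is just a placeholder, so substitute the mount point where the tool unpacks uploads on your install:

```shell
# Show free space (human-readable) on the filesystem where uploads are unpacked.
# "/" is an assumption -- replace it with the actual mount point of your install.
df -h /
```

If the Avail column is smaller than the unpacked size of the archive (well over 40 GB here), the unpack step can fail partway through.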

Thanks

Nir

I reinstalled the application, made sure I had plenty of free disk space, and tried to upload the zip again. This time it worked, but unpacking the zip took a very long time, much longer than unpacking it with a terminal command.
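For reference, unpacking by hand was roughly the following; the paths here are stand-ins for demonstration (a tiny archive built with Python's stdlib, since the real 40 GB zip and the tool's actual uploads path will differ on your install):

```shell
# Build a tiny stand-in archive with Python's zipfile module
# (paths under /tmp are demo placeholders, not the tool's real layout)
python3 - <<'EOF'
import os, zipfile
os.makedirs('/tmp/demo_src', exist_ok=True)
with open('/tmp/demo_src/img001.jpg', 'w') as f:
    f.write('image data')
with zipfile.ZipFile('/tmp/images.zip', 'w') as z:
    z.write('/tmp/demo_src/img001.jpg', 'img001.jpg')
EOF

# Extract straight into the uploads folder from the terminal
# (zipfile's -e CLI extracts an archive into a target directory)
mkdir -p /tmp/uploads
python3 -m zipfile -e /tmp/images.zip /tmp/uploads/
ls /tmp/uploads
```

On the real dataset I used plain `unzip` directly into the uploads folder, which finished far faster than the in-app unpack.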

I am still facing the problem that when I create the dataset and provide separate paths for validation and testing data, the process adds only the validation data to the dataset. When I try to use it for training, it returns a "no data" error.


Thanks for the update, Stephen.

Let's try to figure out why the training dataset isn't building.

Can you please run the following commands to see the logs, and paste the output here?

Thanks!

docker exec $(docker ps | grep intelcorp/dl-training-tool:caffe | awk '{print $1}') bash -c "cat /var/log/dlsdk/dlsdk.log"

docker exec $(docker ps | grep intelcorp/dl-training-tool:tf | awk '{print $1}') bash -c "cat /var/log/dlsdk/tf/log.log"
