Collective Knowledge Aggregator (proof-of-concept)
Add/update entry:
Module/class:
(under development) auto/crowd-tune CUDA work size (execution time)
(under development) auto/crowd-tune OpenCL local work size (execution time)
(under development) auto/crowd-tune OpenCL local work size (execution time/FPS vs energy)
(under development) crowdsource OpenCL bug detection
(under development) crowdsource modeling of program behavior
(under development) crowdsource program numerical stability
(under development) crowdsource program scalability
Collaborative Program Optimization using mobile devices
advice
ae
ae.person.table
ai-artifact
algorithm
all
announcements.funding
announcements.job
apk
artifact
auto/crowd-tune GCC compiler flags (custom dimensions)
auto/crowd-tune GCC compiler flags (do not degrade execution time, do not degrade code size)
auto/crowd-tune GCC compiler flags (minimize execution time and code size)
auto/crowd-tune GCC compiler flags (minimize execution time)
auto/crowd-tune GCC compiler flags (minimize execution time, do not degrade code size)
auto/crowd-tune GCC compiler flags (minimize total binary size, do not degrade execution time)
auto/crowd-tune LLVM compiler flags (do not degrade execution time, do not degrade code size)
auto/crowd-tune LLVM compiler flags (minimize execution time)
auto/crowd-tune LLVM compiler flags (minimize execution time, do not degrade code size)
auto/crowd-tune OpenCL-based CLBlast (GFLOPs)
autotune custom pipeline dimensions
award
caffe
caffe2
cbricks
cfg
challenge.vqe
choice
class
clblast
cmdgen
compiler
crowd-benchmark DNN libraries and models
crowd-benchmark DNN libraries and models (Caffe - dev)
crowd-benchmark DNN libraries and models (Caffe2)
crowd-benchmark DNN libraries and models (TensorFlow)
crowd-benchmark DNN libraries and models (dividiti desktop app)
crowd-benchmark DNN libraries and models using mobile devices
crowd-benchmark shared workloads via ARM WA framework
crowd-test OpenCL compilers (beta) - crowdsource bug detection via CK
crowd-test OpenGL compilers (beta)
crowdnode
dashboard
dataset
dataset.features
demo
device (deprecated or not used)
dissemination.announcement
dissemination.book
dissemination.conference
dissemination.event
dissemination.hardware
dissemination.journal
dissemination.keynote
dissemination.lecture
dissemination.patent
dissemination.pitfall
dissemination.poster
dissemination.presentation
dissemination.press-release
dissemination.publication
dissemination.publication.artifact
dissemination.repo
dissemination.soft
dissemination.workshop
docker
env
experiment
experiment.raw
experiment.scenario.android
experiment.user
experiment.view
explore DNN batch size
explore GCC compiler flags
explore LLVM compiler flags
explore OpenBLAS number of threads
explore OpenMP number of threads
explore compiler flags
fuzz GCC compiler flags (search for bugs)
fuzz LLVM compiler flags (search for bugs)
gemmbench.crowdtuning
graph
graph.dot
hackathon.20180615
hackathon.20181006
hackathon.20190127
hackathon.20190315
index
jnotebook
kernel
log
machine
math.conditions
math.frontier
math.variation
me
milepost
misc
mlperf
mlperf.inference
mlperf.mobilenets
model
model.image.classification
model.r
model.sklearn
model.species
model.tensorflowapi
model.tf
module
nntest
open ReQuEST @ ASPLOS'18 tournament (Pareto-efficient image classification)
organization
os
package
person
photo
pipeline
pipeline.cmd
platform
platform.cpu
platform.dsp
platform.gpgpu
platform.gpu
platform.init
platform.nn
platform.npu
platform.os
proceedings.acm
program
program.behavior
program.dynamic.features
program.experiment.speedup
program.optimization
program.output
program.species
program.static.features
qml
qr-code
repo
report
research.topic
result
scc-workflow
script
slide
soft
solution
sut
table
tensorflow
test
tmp
user
video
vqe
wa
wa-device
wa-params
wa-result
wa-scenario
wa-tool
web
wfe
xml
Repository:
CK (machine learning based) multi-objective autotuning
CK analytics
CK crowdtuning (crowdsourcing autotuning)
CK dissemination modules
CK repository to crowdsource optimization of benchmarks, kernels and realistic workloads across Raspberry Pi devices provided by volunteers (starting from compiler flag autotuning)
CK web
Large and shared artifacts (HOG experiments) to reproduce CK paper
Reproducible and interactive papers with all shared artifacts for our CK papers
Reproducing PAMELA project (medium data set (20 frames) for slambench) via CK
Reproducing PAMELA project (slambench analysis and crowd-tuning) via CK
Tool clsmith converted to CK format
cTuning datasets (min)
cTuning programs
cbricks
ck-ai
ck-artifact-evaluation
ck-assets
ck-caffe
ck-caffe2
ck-cntk
ck-crowd-scenarios
ck-crowdsource-dnn-optimization
ck-crowdtuning-platforms
ck-dev-compilers
ck-dissemination
ck-docker
ck-env
ck-experiments
ck-graph-analytics
ck-math
ck-mlperf
ck-mlperf-sysml-demo-20190402
ck-mxnet
ck-nntest
ck-nntest-20181001
ck-qiskit
ck-quantum
ck-quantum-challenge-vqe
ck-quantum-hackathon-20180615
ck-quantum-hackathon-20181006
ck-quantum-hackathon-20190127
ck-quantum-hackathon-20190315
ck-quantum-hackathons
ck-request
ck-request-asplos18-caffe-intel
ck-request-asplos18-iot-farm
ck-request-asplos18-mobilenets-armcl-opencl
ck-request-asplos18-mobilenets-tvm-arm
ck-request-asplos18-resnet-tvm-fpga
ck-request-asplos18-results
ck-request-asplos18-results-caffe-intel
ck-request-asplos18-results-iot-farm
ck-request-asplos18-results-mobilenets-armcl-opencl
ck-request-asplos18-results-mobilenets-tvm-arm
ck-request-asplos18-results-resnet-tvm-fpga
ck-rigetti
ck-rpi-optimization-results
ck-scc
ck-scc18
ck-tensorflow
ck-tensorrt
ck-tvm
ck-wa
ck-wa-extra
ck-wa-workloads
ck-website
ctuning-datasets
default
gemmbench
local
mlperf-mobilenets
reproduce-carp-project
reproduce-milepost-project
shader-compiler-bugs
upload
Alias:
(UID):
User-friendly name:
Tags:
Author:
Author email:
Author web-page:
Entry copyright:
Entry license:
Meta-description in JSON:
{ "all_raw_results": [ { "behavior_uid": "880042468a8620fc", "cpu_freqs_after": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362, "6": 2362, "7": 2362 } ], "cpu_freqs_before": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362, "6": 2362, "7": 2362 } ], "image_height": 3024, "image_width": 4032, "prediction": "0.9835 - \"n03793489 mouse, computer mouse\"\n0.0042 - \"n04548280 wall clock\"\n0.0015 - \"n03532672 hook, claw\"\n0.0013 - \"n02988304 CD player\"\n0.0013 - \"n04317175 stethoscope\"\n", "time": [ 6469, 6860, 6842 ], "user": "", "xopenme": { "execution_time": [ 4.668767, 4.77017, 4.719789 ], "execution_time_kernel_0": [ 4.668767, 4.77017, 4.719789 ], "execution_time_kernel_1": [ 0.500118, 0.504171, 0.507339 ], "execution_time_kernel_2": [ 1.160956, 1.480802, 1.488871 ] } }, { "behavior_uid": "2e695ef505c32c9a", "cpu_freqs_after": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "cpu_freqs_before": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "image_height": 168, "image_width": 234, "mispredictions": [ { "correct_answer": "Stonehenge", "mispredicted_image": "misprediction-image-ebfdeeaae2a4d29b.jpg", "misprediction_results": "0.5988 - \"n04428191 thresher, thrasher, threshing machine\"\n0.1070 - \"n04604644 worm fence, snake fence, snake-rail fence, Virginia fence\"\n0.0546 - \"n03496892 harvester, reaper\"\n0.0316 - \"n03764736 milk can\"\n0.0230 - \"n03384352 forklift\"\n" } ], "prediction": "0.5988 - \"n04428191 thresher, thrasher, threshing machine\"\n0.1070 - \"n04604644 worm fence, snake fence, snake-rail fence, Virginia fence\"\n0.0546 - \"n03496892 harvester, reaper\"\n0.0316 - \"n03764736 milk can\"\n0.0230 - \"n03384352 forklift\"\n", "time": [ 7147, 7166, 6751 ], "user": "", "xopenme": { "execution_time": [ 5.693378, 5.729819, 5.653549 ], "execution_time_kernel_0": [ 5.693378, 5.729819, 5.653549 ], "execution_time_kernel_1": [ 0.002381, 0.002362, 0.002378 
], "execution_time_kernel_2": [ 1.34486, 1.335148, 0.992131 ] } }, { "behavior_uid": "93bbe48056e555e9", "cpu_freqs_after": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362, "6": 2362, "7": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "cpu_freqs_before": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362, "6": 2362, "7": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "image_height": 960, "image_width": 1120, "prediction": "0.5897 - \"n02814860 beacon, lighthouse, beacon light, pharos\"\n0.2755 - \"n02894605 breakwater, groin, groyne, mole, bulwark, seawall, jetty\"\n0.0390 - \"n09399592 promontory, headland, head, foreland\"\n0.0251 - \"n02980441 castle\"\n0.0231 - \"n09332890 lakeside, lakeshore\"\n", "time": [ 6833, 6647, 6872, 6899, 6051, 5879 ], "user": "", "xopenme": { "execution_time": [ 5.613945, 5.592732, 5.594027, 5.563995, 4.648931, 4.629759 ], "execution_time_kernel_0": [ 5.613945, 5.592732, 5.594027, 5.563995, 4.648931, 4.629759 ], "execution_time_kernel_1": [ 0.050073, 0.048114, 0.049178, 0.042273, 0.042275, 0.042704 ], "execution_time_kernel_2": [ 1.075827, 0.908617, 1.126648, 1.160507, 1.270722, 1.114232 ] } }, { "behavior_uid": "3828c0459cdff72c", "cpu_freqs_after": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "cpu_freqs_before": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362, "6": 2362, "7": 2362 } ], "image_height": 1500, "image_width": 1080, "mispredictions": [ { "correct_answer": "Pokémon", "mispredicted_image": "misprediction-image-b034bc3b08f1ed02.jpg", "misprediction_results": "0.1466 - \"n09229709 bubble\"\n0.0579 - \"n02708093 analog clock\"\n0.0481 - \"n04525305 vending machine\"\n0.0377 - \"n02988304 CD player\"\n0.0296 - \"n04286575 spotlight, spot\"\n" } ], "prediction": "0.1466 - \"n09229709 bubble\"\n0.0579 - \"n02708093 analog clock\"\n0.0481 - \"n04525305 vending 
machine\"\n0.0377 - \"n02988304 CD player\"\n0.0296 - \"n04286575 spotlight, spot\"\n", "time": [ 6198, 6402, 5891 ], "user": "", "xopenme": { "execution_time": [ 4.675663, 4.698606, 4.817789 ], "execution_time_kernel_0": [ 4.675663, 4.698606, 4.817789 ], "execution_time_kernel_1": [ 0.054171, 0.060915, 0.060815 ], "execution_time_kernel_2": [ 1.358366, 1.546856, 0.914497 ] } }, { "behavior_uid": "79498dd8efb4349b", "cpu_freqs_after": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "cpu_freqs_before": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "image_height": 1280, "image_width": 720, "mispredictions": [ { "correct_answer": "Chess game ", "mispredicted_image": "misprediction-image-4735935952653dc7.jpg", "misprediction_results": "0.6813 - \"n06359193 web site, website, internet site, site\"\n0.0493 - \"n03782006 monitor\"\n0.0277 - \"n04264628 space bar\"\n0.0210 - \"n06785654 crossword puzzle, crossword\"\n0.0194 - \"n03085013 computer keyboard, keypad\"\n" } ], "prediction": "0.6813 - \"n06359193 web site, website, internet site, site\"\n0.0493 - \"n03782006 monitor\"\n0.0277 - \"n04264628 space bar\"\n0.0210 - \"n06785654 crossword puzzle, crossword\"\n0.0194 - \"n03085013 computer keyboard, keypad\"\n", "time": [ 6270, 6177, 5999 ], "user": "", "xopenme": { "execution_time": [ 4.843188, 4.688223, 4.65865 ], "execution_time_kernel_0": [ 4.843188, 4.688223, 4.65865 ], "execution_time_kernel_1": [ 0.022929, 0.021646, 0.021163 ], "execution_time_kernel_2": [ 1.30319, 1.373706, 1.232607 ] } }, { "behavior_uid": "f0bf1eeaea474484", "cpu_freqs_after": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "cpu_freqs_before": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "image_height": 1080, "image_width": 1920, "prediction": "0.1553 - \"n09193705 alp\"\n0.1465 - \"n10565667 scuba diver\"\n0.0564 - \"n09256479 coral reef\"\n0.0524 - \"n01484850 great white shark, 
white shark, man-eater, man-eating shark, Carcharodon carcharias\"\n0.0465 - \"n09472597 volcano\"\n", "time": [ 5927, 6457, 6285 ], "user": "", "xopenme": { "execution_time": [ 4.734063, 4.793912, 4.682214 ], "execution_time_kernel_0": [ 4.734063, 4.793912, 4.682214 ], "execution_time_kernel_1": [ 0.135006, 0.136707, 0.125224 ], "execution_time_kernel_2": [ 0.968667, 1.432546, 1.391231 ] } }, { "behavior_uid": "25b9ccf47e591ee4", "cpu_freqs_after": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362, "6": 2362, "7": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362, "6": 2362, "7": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "cpu_freqs_before": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362, "6": 2362, "7": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362, "6": 2362, "7": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "image_height": 2204, "image_width": 3920, "mispredictions": [ { "correct_answer": "Church ", "mispredicted_image": "misprediction-image-9f5acece34471aad.jpg", "misprediction_results": "0.9249 - \"n02843684 birdhouse\"\n0.0163 - \"n04372370 switch, electric switch, electrical switch\"\n0.0113 - \"n02825657 bell cote, bell cot\"\n0.0087 - \"n03874599 padlock\"\n0.0038 - \"n03467068 guillotine\"\n" }, { "correct_answer": "Dragon", "mispredicted_image": "misprediction-image-331224752b6daa70.jpg", "misprediction_results": "0.0952 - \"n04462240 toyshop\"\n0.0653 - \"n02966193 carousel, carrousel, merry-go-round, roundabout, whirligig\"\n0.0416 - \"n01704323 triceratops\"\n0.0271 - \"n03724870 mask\"\n0.0266 - \"n07248320 
book jacket, dust cover, dust jacket, dust wrapper\"\n" }, { "correct_answer": "Harbour", "mispredicted_image": "misprediction-image-ff24b79a318eaa08.jpg", "misprediction_results": "0.3403 - \"n09332890 lakeside, lakeshore\"\n0.1559 - \"n09428293 seashore, coast, seacoast, sea-coast\"\n0.1153 - \"n02980441 castle\"\n0.1040 - \"n09399592 promontory, headland, head, foreland\"\n0.0509 - \"n09246464 cliff, drop, drop-off\"\n" }, { "correct_answer": "lighthouse ", "mispredicted_image": "misprediction-image-0beeb9ea40a91a5a.jpg", "misprediction_results": "0.1539 - \"n03804744 nail\"\n0.1354 - \"n01592084 chickadee\"\n0.1298 - \"n03930313 picket fence, paling\"\n0.0389 - \"n02843684 birdhouse\"\n0.0343 - \"n04209239 shower curtain\"\n" } ], "prediction": "0.9249 - \"n02843684 birdhouse\"\n0.0163 - \"n04372370 switch, electric switch, electrical switch\"\n0.0113 - \"n02825657 bell cote, bell cot\"\n0.0087 - \"n03874599 padlock\"\n0.0038 - \"n03467068 guillotine\"\n", "time": [ 7381, 6575, 6073, 6687, 7131, 7762, 7688, 7231, 7371, 7563, 6196, 6353, 8868, 6562, 6410 ], "user": "", "xopenme": { "execution_time": [ 4.961814, 4.85542, 4.65964, 4.747525, 5.544053, 5.617284, 5.716429, 5.605276, 5.596001, 5.54489, 4.770914, 4.911514, 6.017488, 5.13965, 4.858525 ], "execution_time_kernel_0": [ 4.961814, 4.85542, 4.65964, 4.747525, 5.544053, 5.617284, 5.716429, 5.605276, 5.596001, 5.54489, 4.770914, 4.911514, 6.017488, 5.13965, 4.858525 ], "execution_time_kernel_1": [ 0.391391, 0.37799, 0.413537, 0.47794, 0.571311, 0.575262, 0.527965, 0.528979, 0.536327, 0.445288, 0.403128, 0.413541, 0.483114, 0.393509, 0.406223 ], "execution_time_kernel_2": [ 1.930494, 1.25136, 0.889971, 1.352167, 0.911297, 1.456243, 1.325155, 0.998999, 1.140197, 1.465589, 0.916202, 0.922208, 2.218943, 0.919595, 1.052906 ] } }, { "behavior_uid": "16e46029b358232b", "cpu_freqs_after": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, 
"5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362, "6": 2362, "7": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "cpu_freqs_before": [ { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362, "6": 2362, "7": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 }, { "0": 1709, "1": 1709, "2": 1709, "3": 1709, "4": 2362, "5": 2362 } ], "image_height": 1836, "image_width": 3264, "mispredictions": [ { "correct_answer": "Tank", "mispredicted_image": "misprediction-image-b068fda984f1a407.jpg", "misprediction_results": "0.2579 - \"n03160309 dam, dike, dyke\"\n0.0571 - \"n09246464 cliff, drop, drop-off\"\n0.0541 - \"n02859443 boathouse\"\n0.0535 - \"n03220513 dome\"\n0.0509 - \"n04417672 thatch, thatched roof\"\n" }, { "correct_answer": "Statue", "mispredicted_image": "misprediction-image-7b3973de818bc7c1.jpg", "misprediction_results": "0.1351 - \"n02124075 Egyptian cat\"\n0.0531 - \"n04590129 window shade\"\n0.0428 - \"n04589890 window screen\"\n0.0349 - \"n02123045 tabby, tabby cat\"\n0.0320 - \"n04040759 radiator\"\n" }, { "correct_answer": "food", "mispredicted_image": "misprediction-image-14c6fe24a5685ab1.jpg", "misprediction_results": "0.5276 - \"n07583066 guacamole\"\n0.2368 - \"n07711569 mashed potato\"\n0.1186 - \"n07880968 burrito\"\n0.0341 - \"n07871810 meat loaf, meatloaf\"\n0.0166 - \"n07715103 cauliflower\"\n" }, { "correct_answer": "television", "mispredicted_image": "misprediction-image-421fbee18c54a895.jpg", "misprediction_results": "0.2010 - \"n03709823 mailbag, postbag\"\n0.1624 - \"n02769748 backpack, back pack, 
knapsack, packsack, rucksack, haversack\"\n0.0554 - \"n04026417 purse\"\n0.0553 - \"n04152593 screen, CRT screen\"\n0.0422 - \"n03782006 monitor\"\n" }, { "correct_answer": "rainbow", "mispredicted_image": "misprediction-image-d72b947218de905a.jpg", "misprediction_results": "0.1415 - \"n03837869 obelisk\"\n0.1080 - \"n04523525 vault\"\n0.0583 - \"n04486054 triumphal arch\"\n0.0514 - \"n09246464 cliff, drop, drop-off\"\n0.0388 - \"n03788195 mosque\"\n" } ], "prediction": "0.2579 - \"n03160309 dam, dike, dyke\"\n0.0571 - \"n09246464 cliff, drop, drop-off\"\n0.0541 - \"n02859443 boathouse\"\n0.0535 - \"n03220513 dome\"\n0.0509 - \"n04417672 thatch, thatched roof\"\n", "time": [ 7574, 7155, 6958, 7153, 7005, 7328, 7749, 7290, 6997, 7843, 6251, 6638, 6237, 5697, 6204, 6360, 5664, 5714 ], "user": "", "xopenme": { "execution_time": [ 5.9703, 5.634519, 5.603053, 5.695265, 5.619414, 5.644725, 5.778211, 5.613607, 5.61855, 5.654234, 4.71684, 4.793698, 4.946096, 4.50086, 4.632477, 4.880706, 4.425571, 4.461382 ], "execution_time_kernel_0": [ 5.9703, 5.634519, 5.603053, 5.695265, 5.619414, 5.644725, 5.778211, 5.613607, 5.61855, 5.654234, 4.71684, 4.793698, 4.946096, 4.50086, 4.632477, 4.880706, 4.425571, 4.461382 ], "execution_time_kernel_1": [ 0.359595, 0.354692, 0.361867, 0.350546, 0.355296, 0.355011, 0.311494, 0.308172, 0.311208, 0.322614, 0.309333, 0.293495, 0.273844, 0.280749, 0.274506, 0.320058, 0.286498, 0.300537 ], "execution_time_kernel_2": [ 1.130488, 1.070549, 0.891042, 0.988112, 0.929911, 1.226935, 1.532162, 1.268745, 0.961459, 1.745365, 1.135861, 1.449002, 0.902825, 0.815133, 1.2114, 1.030821, 0.860259, 0.864034 ] } } ], "meta": { "cpu_abi": "arm64-v8a", "cpu_name": "AArch64 Processor rev 4 (aarch64)", "cpu_uid": "961465bb3cc347c2", "crowd_uid": "a7340ffbefcb5923", "engine": "Caffe CPU", "gpgpu_name": "", "gpgpu_uid": "", "gpu_name": "ARM Mali-T830", "gpu_uid": "cd4f4a81a14cb70f", "model": "BVLC AlexNet", "os_name": "Android 7.0", "os_uid": "de98569847a92092", 
"plat_name": "UNKNOWN GENERIC_A15", "platform_uid": "fd90149f38ddfbc3" } }
Upload zip file:
Overwrite existing files:
Developed by Grigori Fursin
Implemented as a CK workflow
Hosted at