Diabetes Prediction

We first import Keras and NumPy and load the dataset.

In [1]:
import keras
Using Theano backend.
In [2]:
import numpy as np
In [6]:
data = np.loadtxt('diabetes.csv', delimiter=',')

The data consist of 8 features with diabetes indicators plus a binary output value indicating whether the patient developed diabetes five years later.

In [7]:
data
Out[7]:
array([[  6.   , 148.   ,  72.   , ...,   0.627,  50.   ,   1.   ],
       [  1.   ,  85.   ,  66.   , ...,   0.351,  31.   ,   0.   ],
       [  8.   , 183.   ,  64.   , ...,   0.672,  32.   ,   1.   ],
       ...,
       [  5.   , 121.   ,  72.   , ...,   0.245,  30.   ,   0.   ],
       [  1.   , 126.   ,  60.   , ...,   0.349,  47.   ,   1.   ],
       [  1.   ,  93.   ,  70.   , ...,   0.315,  23.   ,   0.   ]])
In [8]:
data.shape
Out[8]:
(768, 9)

We split the data into the input features X and the target label y.

In [9]:
X = data[:,:8]
In [10]:
y = data[:, 8]
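
No separate test set is held out in this notebook. If one were desired, a minimal sketch using scikit-learn (assuming it is available) could look like this:

from sklearn.model_selection import train_test_split

# Hypothetical hold-out split (not used below): 80% for training, 20% for testing
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)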

We use a sequential network, i.e. one in which information is only passed from one layer to the next.

In [11]:
from keras.models import Sequential

We now build the network layer by layer; to do so, we first need to initialize an instance of a sequential model.

In [12]:
model = Sequential()

Our network consists solely of dense (fully connected) layers.

In [13]:
from keras.layers import Dense

The first layer must be given the input dimension, i.e. the number of features. In addition, every layer needs a weight initialization and a nonlinear activation function.

In [16]:
model.add(Dense(12, input_dim=8, kernel_initializer='uniform', activation='relu'))
In [18]:
model.add(Dense(8, kernel_initializer='uniform', activation='relu'))
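
The model built so far can be inspected at any point: model.summary() prints each layer's output shape and parameter count, a quick sanity check that the input dimension was picked up correctly.

model.summary()  # prints layer types, output shapes and the number of trainable parameters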

The output layer must match the problem. Here we have a simple yes/no decision, i.e. a binary classification, so only a single node is used, which outputs the probability.

In [19]:
model.add(Dense(1, kernel_initializer='uniform', activation='relu'))
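
Note that the text above describes the output as a probability, yet the ReLU activation used here does not constrain the output to the interval [0, 1]; this is why some predictions further down come out as exactly 0 or even above 1. For binary classification, a sigmoid output layer would be the usual choice; as a sketch (replacing the cell above rather than adding to it):

# Alternative output layer: a sigmoid squashes the output into (0, 1), i.e. a proper probability
model.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))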

Our network is now fully assembled and can be compiled, i.e. we equip it with a loss function, an optimization method, and an evaluation metric.

In [20]:
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
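
The binary cross-entropy loss compares a predicted probability p with the true label y as -(y*log(p) + (1-y)*log(1-p)), averaged over all samples. A small NumPy sketch of the same computation (illustrative names, not part of Keras; np as imported above):

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # clip to avoid log(0), then average -(y*log(p) + (1-y)*log(1-p))
    p = np.clip(y_pred, eps, 1 - eps)
    return np.mean(-(y_true * np.log(p) + (1 - y_true) * np.log(1 - p)))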

Now the network can be trained on the data prepared above.

In [21]:
model.fit(X, y, epochs=150, batch_size=10, verbose=2)
Epoch 1/150
 - 0s - loss: 1.1076 - acc: 0.6576
Epoch 2/150
 - 0s - loss: 0.6063 - acc: 0.6771
Epoch 3/150
 - 0s - loss: 0.5989 - acc: 0.6758
Epoch 4/150
 - 0s - loss: 0.5932 - acc: 0.6810
Epoch 5/150
 - 0s - loss: 0.5918 - acc: 0.6901
Epoch 6/150
 - 0s - loss: 0.5845 - acc: 0.6797
Epoch 7/150
 - 0s - loss: 0.5824 - acc: 0.6901
Epoch 8/150
 - 0s - loss: 0.5732 - acc: 0.6953
Epoch 9/150
 - 0s - loss: 0.5675 - acc: 0.7266
Epoch 10/150
 - 0s - loss: 0.5627 - acc: 0.7240
Epoch 11/150
 - 0s - loss: 0.5531 - acc: 0.7057
Epoch 12/150
 - 0s - loss: 0.5474 - acc: 0.7266
Epoch 13/150
 - 0s - loss: 0.5503 - acc: 0.7292
Epoch 14/150
 - 0s - loss: 0.5411 - acc: 0.7435
Epoch 15/150
 - 0s - loss: 0.5561 - acc: 0.7357
Epoch 16/150
 - 0s - loss: 0.5641 - acc: 0.7227
Epoch 17/150
 - 0s - loss: 0.5453 - acc: 0.7214
Epoch 18/150
 - 0s - loss: 0.5411 - acc: 0.7370
Epoch 19/150
 - 0s - loss: 0.5620 - acc: 0.7253
Epoch 20/150
 - 0s - loss: 0.5409 - acc: 0.7266
Epoch 21/150
 - 0s - loss: 0.5335 - acc: 0.7292
Epoch 22/150
 - 0s - loss: 0.5317 - acc: 0.7461
Epoch 23/150
 - 0s - loss: 0.5339 - acc: 0.7331
Epoch 24/150
 - 0s - loss: 0.5358 - acc: 0.7370
Epoch 25/150
 - 0s - loss: 0.5321 - acc: 0.7370
Epoch 26/150
 - 0s - loss: 0.5298 - acc: 0.7422
Epoch 27/150
 - 0s - loss: 0.5277 - acc: 0.7318
Epoch 28/150
 - 0s - loss: 0.5301 - acc: 0.7292
Epoch 29/150
 - 0s - loss: 0.5262 - acc: 0.7383
Epoch 30/150
 - 0s - loss: 0.5460 - acc: 0.7292
Epoch 31/150
 - 0s - loss: 0.5225 - acc: 0.7461
Epoch 32/150
 - 0s - loss: 0.5283 - acc: 0.7435
Epoch 33/150
 - 0s - loss: 0.5231 - acc: 0.7474
Epoch 34/150
 - 0s - loss: 0.5388 - acc: 0.7487
Epoch 35/150
 - 0s - loss: 0.5168 - acc: 0.7409
Epoch 36/150
 - 0s - loss: 0.5177 - acc: 0.7435
Epoch 37/150
 - 0s - loss: 0.5196 - acc: 0.7526
Epoch 38/150
 - 0s - loss: 0.5183 - acc: 0.7370
Epoch 39/150
 - 0s - loss: 0.5329 - acc: 0.7474
Epoch 40/150
 - 0s - loss: 0.5127 - acc: 0.7461
Epoch 41/150
 - 0s - loss: 0.5379 - acc: 0.7435
Epoch 42/150
 - 0s - loss: 0.5071 - acc: 0.7383
Epoch 43/150
 - 0s - loss: 0.5339 - acc: 0.7474
Epoch 44/150
 - 0s - loss: 0.5018 - acc: 0.7565
Epoch 45/150
 - 0s - loss: 0.5181 - acc: 0.7448
Epoch 46/150
 - 0s - loss: 0.5055 - acc: 0.7526
Epoch 47/150
 - 0s - loss: 0.5143 - acc: 0.7682
Epoch 48/150
 - 0s - loss: 0.5654 - acc: 0.7370
Epoch 49/150
 - 0s - loss: 0.5193 - acc: 0.7435
Epoch 50/150
 - 0s - loss: 0.5085 - acc: 0.7682
Epoch 51/150
 - 0s - loss: 0.5063 - acc: 0.7578
Epoch 52/150
 - 0s - loss: 0.5018 - acc: 0.7630
Epoch 53/150
 - 0s - loss: 0.5215 - acc: 0.7552
Epoch 54/150
 - 0s - loss: 0.5123 - acc: 0.7578
Epoch 55/150
 - 0s - loss: 0.5136 - acc: 0.7513
Epoch 56/150
 - 0s - loss: 0.5254 - acc: 0.7565
Epoch 57/150
 - 0s - loss: 0.4955 - acc: 0.7812
Epoch 58/150
 - 0s - loss: 0.4899 - acc: 0.7604
Epoch 59/150
 - 0s - loss: 0.4963 - acc: 0.7513
Epoch 60/150
 - 0s - loss: 0.4897 - acc: 0.7552
Epoch 61/150
 - 0s - loss: 0.4996 - acc: 0.7695
Epoch 62/150
 - 0s - loss: 0.5099 - acc: 0.7617
Epoch 63/150
 - 0s - loss: 0.6380 - acc: 0.7344
Epoch 64/150
 - 0s - loss: 0.5120 - acc: 0.7552
Epoch 65/150
 - 0s - loss: 0.4980 - acc: 0.7682
Epoch 66/150
 - 0s - loss: 0.4921 - acc: 0.7695
Epoch 67/150
 - 0s - loss: 0.4878 - acc: 0.7617
Epoch 68/150
 - 0s - loss: 0.4963 - acc: 0.7591
Epoch 69/150
 - 0s - loss: 0.4850 - acc: 0.7526
Epoch 70/150
 - 0s - loss: 0.4962 - acc: 0.7552
Epoch 71/150
 - 0s - loss: 0.4777 - acc: 0.7708
Epoch 72/150
 - 0s - loss: 0.4813 - acc: 0.7760
Epoch 73/150
 - 0s - loss: 0.4852 - acc: 0.7617
Epoch 74/150
 - 0s - loss: 0.4802 - acc: 0.7695
Epoch 75/150
 - 0s - loss: 0.4844 - acc: 0.7591
Epoch 76/150
 - 0s - loss: 0.4811 - acc: 0.7552
Epoch 77/150
 - 0s - loss: 0.4810 - acc: 0.7695
Epoch 78/150
 - 0s - loss: 0.5463 - acc: 0.7461
Epoch 79/150
 - 0s - loss: 0.5016 - acc: 0.7552
Epoch 80/150
 - 0s - loss: 0.4898 - acc: 0.7669
Epoch 81/150
 - 0s - loss: 0.4793 - acc: 0.7669
Epoch 82/150
 - 0s - loss: 0.4812 - acc: 0.7578
Epoch 83/150
 - 0s - loss: 0.4734 - acc: 0.7656
Epoch 84/150
 - 0s - loss: 0.4716 - acc: 0.7578
Epoch 85/150
 - 0s - loss: 0.8901 - acc: 0.7383
Epoch 86/150
 - 0s - loss: 0.5346 - acc: 0.7357
Epoch 87/150
 - 0s - loss: 0.5222 - acc: 0.7526
Epoch 88/150
 - 0s - loss: 0.5037 - acc: 0.7552
Epoch 89/150
 - 0s - loss: 0.4978 - acc: 0.7656
Epoch 90/150
 - 0s - loss: 0.4903 - acc: 0.7695
Epoch 91/150
 - 0s - loss: 0.4834 - acc: 0.7695
Epoch 92/150
 - 0s - loss: 0.4807 - acc: 0.7578
Epoch 93/150
 - 0s - loss: 0.4812 - acc: 0.7747
Epoch 94/150
 - 0s - loss: 0.4770 - acc: 0.7695
Epoch 95/150
 - 0s - loss: 0.4799 - acc: 0.7630
Epoch 96/150
 - 0s - loss: 0.4750 - acc: 0.7591
Epoch 97/150
 - 0s - loss: 0.4988 - acc: 0.7604
Epoch 98/150
 - 0s - loss: 0.4777 - acc: 0.7565
Epoch 99/150
 - 0s - loss: 0.4767 - acc: 0.7617
Epoch 100/150
 - 0s - loss: 0.4938 - acc: 0.7669
Epoch 101/150
 - 0s - loss: 0.4905 - acc: 0.7591
Epoch 102/150
 - 0s - loss: 0.4863 - acc: 0.7565
Epoch 103/150
 - 0s - loss: 0.4753 - acc: 0.7630
Epoch 104/150
 - 0s - loss: 0.4833 - acc: 0.7643
Epoch 105/150
 - 0s - loss: 0.4756 - acc: 0.7695
Epoch 106/150
 - 0s - loss: 0.4840 - acc: 0.7656
Epoch 107/150
 - 0s - loss: 0.4739 - acc: 0.7695
Epoch 108/150
 - 0s - loss: 0.4737 - acc: 0.7669
Epoch 109/150
 - 0s - loss: 0.7047 - acc: 0.7578
Epoch 110/150
 - 0s - loss: 0.6306 - acc: 0.7305
Epoch 111/150
 - 0s - loss: 0.5035 - acc: 0.7604
Epoch 112/150
 - 0s - loss: 0.5001 - acc: 0.7578
Epoch 113/150
 - 0s - loss: 0.4978 - acc: 0.7682
Epoch 114/150
 - 0s - loss: 0.4953 - acc: 0.7773
Epoch 115/150
 - 0s - loss: 0.4938 - acc: 0.7734
Epoch 116/150
 - 0s - loss: 0.4921 - acc: 0.7773
Epoch 117/150
 - 0s - loss: 0.4922 - acc: 0.7747
Epoch 118/150
 - 0s - loss: 0.4911 - acc: 0.7695
Epoch 119/150
 - 0s - loss: 0.4890 - acc: 0.7669
Epoch 120/150
 - 0s - loss: 0.4885 - acc: 0.7695
Epoch 121/150
 - 0s - loss: 0.4897 - acc: 0.7812
Epoch 122/150
 - 0s - loss: 0.4896 - acc: 0.7669
Epoch 123/150
 - 0s - loss: 0.4845 - acc: 0.7708
Epoch 124/150
 - 0s - loss: 0.4839 - acc: 0.7656
Epoch 125/150
 - 0s - loss: 0.4834 - acc: 0.7721
Epoch 126/150
 - 0s - loss: 0.4836 - acc: 0.7773
Epoch 127/150
 - 0s - loss: 0.4842 - acc: 0.7773
Epoch 128/150
 - 0s - loss: 0.4813 - acc: 0.7721
Epoch 129/150
 - 0s - loss: 0.4805 - acc: 0.7708
Epoch 130/150
 - 0s - loss: 0.4790 - acc: 0.7734
Epoch 131/150
 - 0s - loss: 0.4795 - acc: 0.7708
Epoch 132/150
 - 0s - loss: 0.4968 - acc: 0.7669
Epoch 133/150
 - 0s - loss: 0.4793 - acc: 0.7708
Epoch 134/150
 - 0s - loss: 0.4781 - acc: 0.7760
Epoch 135/150
 - 0s - loss: 0.4780 - acc: 0.7682
Epoch 136/150
 - 0s - loss: 0.4759 - acc: 0.7682
Epoch 137/150
 - 0s - loss: 0.4764 - acc: 0.7604
Epoch 138/150
 - 0s - loss: 0.4924 - acc: 0.7747
Epoch 139/150
 - 0s - loss: 0.4948 - acc: 0.7500
Epoch 140/150
 - 0s - loss: 0.4757 - acc: 0.7799
Epoch 141/150
 - 0s - loss: 0.4797 - acc: 0.7708
Epoch 142/150
 - 0s - loss: 0.4735 - acc: 0.7656
Epoch 143/150
 - 0s - loss: 0.5145 - acc: 0.7565
Epoch 144/150
 - 0s - loss: 0.4779 - acc: 0.7721
Epoch 145/150
 - 0s - loss: 0.4784 - acc: 0.7734
Epoch 146/150
 - 0s - loss: 0.4920 - acc: 0.7682
Epoch 147/150
 - 0s - loss: 0.4758 - acc: 0.7656
Epoch 148/150
 - 0s - loss: 0.4723 - acc: 0.7708
Epoch 149/150
 - 0s - loss: 0.4915 - acc: 0.7721
Epoch 150/150
 - 0s - loss: 0.4737 - acc: 0.7734
Out[21]:
<keras.callbacks.History at 0x275da361f60>
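
After training, the loss and accuracy on the training data can also be obtained directly via model.evaluate, which returns the compiled loss followed by the metrics, for example:

loss, acc = model.evaluate(X, y, verbose=0)  # training-set loss and accuracy
print(acc)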

Finally, we make predictions for the training data points (in practice, we would want to make predictions for new patients).

In [22]:
predictions = model.predict(X)
In [23]:
predictions
Out[23]:
array([[0.5895569 ],
       [0.07209119],
       [0.73943496],
       [0.06971409],
       [0.56744736],
       [0.16736916],
       [0.07916891],
       [0.70113474],
       [0.70780486],
       [0.0841507 ],
       [0.18372138],
       [0.8001631 ],
       [0.5328422 ],
       [0.71191406],
       [0.555714  ],
       [0.47276732],
       [0.2875331 ],
       [0.19324683],
       [0.35026702],
       [0.21732982],
       [0.30933672],
       [0.2545547 ],
       [0.8757068 ],
       [0.33838716],
       [0.5762364 ],
       [0.4197745 ],
       [0.56793463],
       [0.07042789],
       [0.52501196],
       [0.24016228],
       [0.34543806],
       [0.46688247],
       [0.0728752 ],
       [0.06707445],
       [0.3895758 ],
       [0.13321255],
       [0.53622144],
       [0.31856197],
       [0.1429341 ],
       [0.3741851 ],
       [0.6868686 ],
       [0.46828988],
       [0.1393371 ],
       [0.76773703],
       [0.57003015],
       [0.8659701 ],
       [0.4002729 ],
       [0.06005776],
       [0.35258985],
       [0.14101723],
       [0.06599392],
       [0.08628587],
       [0.08068702],
       [0.718072  ],
       [0.5749956 ],
       [0.05452718],
       [0.8083579 ],
       [0.21861757],
       [0.49239084],
       [0.1789858 ],
       [0.05686887],
       [0.43347737],
       [0.04493207],
       [0.32203665],
       [0.3189259 ],
       [0.12529095],
       [0.14071925],
       [0.28644183],
       [0.06560431],
       [0.34078446],
       [0.13168833],
       [0.37461427],
       [0.5658737 ],
       [0.25700983],
       [0.06769021],
       [0.        ],
       [0.07279848],
       [0.20497392],
       [0.6636096 ],
       [0.10735328],
       [0.17919326],
       [0.04462092],
       [0.09104152],
       [0.07668725],
       [0.45904562],
       [0.16208826],
       [0.5043411 ],
       [0.15888073],
       [0.6552896 ],
       [0.08669265],
       [0.05224052],
       [0.25216594],
       [0.24913485],
       [0.33615628],
       [0.2492256 ],
       [0.47164097],
       [0.08670838],
       [0.05000031],
       [0.14847542],
       [0.34895256],
       [0.5971281 ],
       [0.35396078],
       [0.08032336],
       [0.06060084],
       [0.10264611],
       [0.23605497],
       [0.04924574],
       [0.43090764],
       [0.08891423],
       [0.08212325],
       [0.5457552 ],
       [0.59975713],
       [0.07412757],
       [0.07831982],
       [0.6239992 ],
       [0.4465526 ],
       [0.32213917],
       [0.09430327],
       [0.12090737],
       [0.07660224],
       [0.71305764],
       [0.34443936],
       [0.13506192],
       [0.37328795],
       [0.10941653],
       [0.38743445],
       [0.35305658],
       [0.22175236],
       [0.1897703 ],
       [0.12001473],
       [0.55922335],
       [0.49768436],
       [0.5877022 ],
       [0.21343096],
       [0.07287034],
       [0.27997196],
       [0.08363944],
       [0.07668909],
       [0.21225034],
       [0.19163083],
       [0.22466187],
       [0.27317992],
       [0.18407305],
       [0.3345028 ],
       [0.4582184 ],
       [0.03954626],
       [0.0761354 ],
       [0.199728  ],
       [0.5287767 ],
       [0.07189755],
       [0.340557  ],
       [0.1528316 ],
       [0.62634087],
       [0.42582133],
       [0.97430027],
       [0.76058996],
       [0.08401039],
       [0.09865476],
       [0.07110548],
       [0.98744696],
       [0.4193007 ],
       [0.2946699 ],
       [0.211605  ],
       [0.09220731],
       [0.21882159],
       [0.20307608],
       [0.4816708 ],
       [0.26032925],
       [0.2014639 ],
       [0.09633338],
       [0.1473803 ],
       [0.41865358],
       [0.2970481 ],
       [0.10492131],
       [0.07012684],
       [0.76489055],
       [0.09122595],
       [0.4852252 ],
       [0.56170994],
       [0.41767147],
       [0.07272561],
       [0.21455128],
       [0.        ],
       [0.06877968],
       [0.3533007 ],
       [0.98359317],
       [0.759527  ],
       [0.25289276],
       [0.23797137],
       [0.35615662],
       [0.08912145],
       [0.43038675],
       [0.5996698 ],
       [1.1006861 ],
       [0.11734909],
       [0.5520561 ],
       [0.08092462],
       [0.09125753],
       [0.26383296],
       [0.413294  ],
       [0.11050481],
       [0.34579274],
       [0.10487781],
       [0.07216534],
       [0.30858874],
       [0.15180631],
       [0.9158619 ],
       [0.55557126],
       [0.08642142],
       [0.82327026],
       [0.07115921],
       [0.4973069 ],
       [0.73680705],
       [0.41002068],
       [0.30627376],
       [0.730506  ],
       [0.27156097],
       [0.32748485],
       [0.09043264],
       [0.31512952],
       [0.5536517 ],
       [0.5182219 ],
       [0.53967977],
       [0.54641956],
       [0.07831366],
       [0.07469267],
       [0.08769138],
       [0.74758476],
       [0.8077539 ],
       [0.25368407],
       [0.48058912],
       [0.51700443],
       [0.05611557],
       [0.29950526],
       [0.06647255],
       [0.74146956],
       [0.75419265],
       [0.68727833],
       [0.6729693 ],
       [0.06722261],
       [0.07612352],
       [0.09079918],
       [0.34957346],
       [0.33310965],
       [0.42384344],
       [0.81847817],
       [0.4012627 ],
       [0.53931004],
       [0.39131892],
       [0.08883794],
       [0.3380303 ],
       [0.18480437],
       [0.06440316],
       [0.07987553],
       [0.24419133],
       [0.23461889],
       [0.27558473],
       [0.1441769 ],
       [0.60418403],
       [0.70625335],
       [0.6746963 ],
       [0.6829059 ],
       [0.13433205],
       [0.44886073],
       [0.29965076],
       [0.19945642],
       [0.6686811 ],
       [0.486491  ],
       [0.0781022 ],
       [0.63563305],
       [0.4529831 ],
       [0.09046721],
       [0.16289611],
       [0.06236198],
       [0.41328052],
       [0.21338904],
       [0.2359142 ],
       [0.08335976],
       [0.24873711],
       [0.08874044],
       [0.3956612 ],
       [0.4746934 ],
       [0.35750178],
       [0.54729044],
       [0.1273351 ],
       [0.41710165],
       [0.51236427],
       [0.3051916 ],
       [0.08090873],
       [0.2633597 ],
       [0.06516571],
       [0.18109244],
       [0.4039388 ],
       [0.39273772],
       [0.47913122],
       [0.57225865],
       [0.3222306 ],
       [0.15518028],
       [0.38674638],
       [0.29128027],
       [0.8276815 ],
       [0.3904337 ],
       [0.08483292],
       [0.36119807],
       [0.31093732],
       [0.2846022 ],
       [0.6114304 ],
       [0.17693706],
       [0.24184991],
       [0.27460632],
       [0.08554484],
       [0.16396908],
       [0.3669616 ],
       [0.21152718],
       [0.3494076 ],
       [0.18476546],
       [0.07020616],
       [0.62849605],
       [0.27367434],
       [0.7190055 ],
       [0.28175753],
       [0.18205415],
       [0.18738846],
       [0.60957646],
       [0.20065816],
       [0.32880637],
       [0.2765111 ],
       [0.8469053 ],
       [0.18067068],
       [0.21362847],
       [0.3324531 ],
       [0.0819103 ],
       [1.0500433 ],
       [0.23906846],
       [0.07246021],
       [0.5869061 ],
       [0.49053064],
       [0.25663805],
       [0.58553505],
       [0.766526  ],
       [0.17916763],
       [0.07885982],
       [0.        ],
       [0.27507567],
       [0.29732618],
       [0.4536896 ],
       [0.35570243],
       [0.39165202],
       [0.07946455],
       [1.3373611 ],
       [0.14696142],
       [0.3102507 ],
       [0.06673962],
       [0.07436867],
       [0.11677957],
       [0.6160863 ],
       [0.3241913 ],
       [1.0172082 ],
       [0.30512467],
       [0.7166516 ],
       [0.70229685],
       [0.5730408 ],
       [0.27216038],
       [0.569302  ],
       [0.4250739 ],
       [0.2201675 ],
       [0.2641747 ],
       [0.07014193],
       [0.06076901],
       [0.2482962 ],
       [0.641325  ],
       [0.0635729 ],
       [0.07709409],
       [0.18288778],
       [0.34491232],
       [0.65037996],
       [0.06556613],
       [0.08948396],
       [0.6626669 ],
       [0.11488572],
       [0.11696096],
       [0.07183263],
       [0.08935665],
       [0.07654001],
       [0.13904445],
       [0.11696821],
       [0.29404393],
       [0.33824316],
       [0.46112844],
       [0.13983671],
       [0.12074298],
       [0.7294199 ],
       [0.1359011 ],
       [0.14991082],
       [0.5143467 ],
       [0.2726329 ],
       [0.11425443],
       [0.330027  ],
       [0.05949184],
       [0.8301632 ],
       [0.10839078],
       [0.40419888],
       [0.399704  ],
       [0.08844759],
       [0.6415042 ],
       [0.39018106],
       [0.23284797],
       [0.07225645],
       [0.8566473 ],
       [0.60608965],
       [0.20633699],
       [0.1620355 ],
       [0.39510715],
       [0.25364497],
       [0.33119318],
       [0.5070923 ],
       [0.08309449],
       [0.50964344],
       [0.05378422],
       [0.24901854],
       [0.28355566],
       [0.07549699],
       [0.17456938],
       [0.20953041],
       [0.6314435 ],
       [0.6767885 ],
       [0.0599705 ],
       [0.63577354],
       [0.3009824 ],
       [0.08892514],
       [0.22332503],
       [0.08514638],
       [0.06323802],
       [0.24680279],
       [0.07733023],
       [0.7238082 ],
       [0.63975495],
       [0.41704825],
       [0.06210953],
       [0.23009516],
       [0.5986199 ],
       [0.07636279],
       [0.2463106 ],
       [0.27208677],
       [0.25076482],
       [1.0782622 ],
       [0.07869227],
       [0.0861641 ],
       [0.11620075],
       [0.14586264],
       [0.05527132],
       [0.26306185],
       [0.08836402],
       [0.40392032],
       [0.18750818],
       [0.994036  ],
       [0.38311228],
       [0.08431631],
       [0.6572355 ],
       [0.533589  ],
       [0.31078976],
       [0.04851277],
       [0.14439075],
       [0.08566014],
       [0.22558482],
       [0.11476016],
       [0.05866349],
       [0.11112216],
       [0.6282226 ],
       [0.642445  ],
       [0.4831878 ],
       [0.3376555 ],
       [0.27393892],
       [0.38753518],
       [0.17287199],
       [0.30217072],
       [0.14045273],
       [0.18464589],
       [0.3352695 ],
       [0.3704131 ],
       [0.50309664],
       [0.20451851],
       [0.0815285 ],
       [0.07273169],
       [0.7980564 ],
       [0.36471123],
       [0.37782562],
       [0.7369922 ],
       [0.08733299],
       [0.8176099 ],
       [0.08611383],
       [0.08450818],
       [0.16894811],
       [0.32997993],
       [0.05528471],
       [0.61284596],
       [0.15323009],
       [0.0693533 ],
       [0.7802367 ],
       [0.51449597],
       [0.08677961],
       [0.08997101],
       [1.9493209 ],
       [0.23401242],
       [0.17506164],
       [0.09255132],
       [0.56712633],
       [0.23789452],
       [0.08254011],
       [0.3377843 ],
       [0.22235873],
       [0.17591202],
       [0.17010815],
       [0.07636673],
       [0.08700106],
       [0.4897532 ],
       [0.53127295],
       [0.41127768],
       [0.2058809 ],
       [0.2357076 ],
       [0.0510007 ],
       [0.24355854],
       [0.1755795 ],
       [0.48472297],
       [0.2870058 ],
       [0.06963111],
       [0.06404011],
       [0.11523388],
       [0.14372809],
       [0.09048226],
       [0.22605397],
       [0.20907705],
       [0.14432336],
       [0.38059333],
       [0.07977726],
       [0.6513427 ],
       [0.08543688],
       [0.05311858],
       [0.24281721],
       [0.33164605],
       [0.322965  ],
       [0.27997413],
       [0.29224625],
       [0.08379459],
       [0.07583348],
       [0.73624736],
       [0.87745696],
       [0.3199232 ],
       [0.5316379 ],
       [0.60598975],
       [0.12570606],
       [0.08043826],
       [0.26556793],
       [0.07651256],
       [0.08315733],
       [0.28334817],
       [0.14294367],
       [0.2842928 ],
       [0.5321407 ],
       [0.1512181 ],
       [0.38139287],
       [0.75712067],
       [0.08614056],
       [0.15598135],
       [0.07422596],
       [0.08255413],
       [0.12646857],
       [0.18175814],
       [0.4800356 ],
       [0.22355184],
       [0.07749959],
       [0.09280685],
       [0.19961505],
       [0.10001844],
       [0.2685801 ],
       [0.3489029 ],
       [0.23761049],
       [0.2728309 ],
       [0.4294338 ],
       [0.9909552 ],
       [0.47941744],
       [0.21570523],
       [0.4357316 ],
       [0.2809262 ],
       [0.32907453],
       [0.07142197],
       [0.54858375],
       [0.11278896],
       [0.6545174 ],
       [0.07483946],
       [0.5694967 ],
       [0.21781023],
       [0.348061  ],
       [0.08262691],
       [0.35673523],
       [0.5500968 ],
       [0.07798102],
       [0.12226097],
       [0.57561797],
       [0.1130829 ],
       [0.08173153],
       [0.3265175 ],
       [0.1898071 ],
       [0.5978296 ],
       [0.96614295],
       [0.34990242],
       [0.64883035],
       [0.06658797],
       [0.426477  ],
       [0.08342533],
       [0.15498178],
       [0.6302931 ],
       [0.6653133 ],
       [0.22280575],
       [0.59511113],
       [0.08832008],
       [0.1660443 ],
       [0.04513311],
       [0.3459591 ],
       [0.44057074],
       [0.19310549],
       [0.0801888 ],
       [0.805667  ],
       [0.11103599],
       [0.11036314],
       [0.10910323],
       [0.11030555],
       [0.21157189],
       [0.35439602],
       [0.07973541],
       [0.26881024],
       [0.08897406],
       [0.10756864],
       [0.13077797],
       [0.14718631],
       [0.3368977 ],
       [0.17460385],
       [0.0822569 ],
       [0.28111523],
       [0.06689227],
       [0.0780924 ],
       [0.31961736],
       [0.45705983],
       [0.30728945],
       [0.09449092],
       [0.4735801 ],
       [0.40342623],
       [0.6709613 ],
       [0.4552608 ],
       [0.08689806],
       [0.07315281],
       [0.22304219],
       [0.31520104],
       [0.20499109],
       [0.10605526],
       [0.4811332 ],
       [0.07919292],
       [0.32056665],
       [0.45694658],
       [0.08115917],
       [0.6090056 ],
       [0.9745218 ],
       [0.6228325 ],
       [0.5972535 ],
       [0.38891265],
       [0.14770328],
       [0.50612557],
       [0.3137658 ],
       [0.24793406],
       [0.5739204 ],
       [0.65737075],
       [0.08061938],
       [0.08468042],
       [0.4236506 ],
       [0.27256772],
       [0.8197883 ],
       [0.51890737],
       [0.08511235],
       [0.33509988],
       [0.08086202],
       [0.04363567],
       [0.69135326],
       [0.15816797],
       [0.26014906],
       [0.15237099],
       [0.26537114],
       [0.20556562],
       [0.14402637],
       [0.23369366],
       [0.5143643 ],
       [0.18276522],
       [0.6818846 ],
       [0.3086898 ],
       [0.49975458],
       [0.06988247],
       [0.36844847],
       [0.50732553],
       [0.21354474],
       [0.26994073],
       [0.3988069 ],
       [0.24287312],
       [0.3470089 ],
       [0.58889437],
       [0.6492482 ],
       [0.12600012],
       [0.11269003],
       [0.28879827],
       [0.2888694 ],
       [0.64999825],
       [0.12221391],
       [0.42124254],
       [0.31692448],
       [0.67280644],
       [0.19582106],
       [0.09159772],
       [0.80543333],
       [0.5999286 ],
       [0.21277049],
       [0.18382727],
       [0.2367105 ],
       [0.07490087],
       [0.2062316 ],
       [0.39799216],
       [0.3236741 ],
       [0.1290846 ],
       [0.3065321 ],
       [0.18564036],
       [0.28590456],
       [0.39844832],
       [0.08254585],
       [0.2602701 ],
       [0.22469507],
       [0.6389795 ],
       [0.11699127],
       [0.10115474],
       [0.20061237],
       [0.11901152],
       [0.08476239],
       [0.12798734],
       [0.18071584],
       [0.54137427],
       [0.17158435],
       [0.09641667],
       [0.47524807],
       [0.73926836],
       [0.2894539 ],
       [0.5327551 ],
       [0.15051661],
       [0.70629317],
       [0.54484457],
       [0.3882943 ],
       [0.2648126 ],
       [0.13151464],
       [0.57952815],
       [0.6249445 ],
       [0.31406048],
       [0.43088958],
       [0.29145566],
       [0.13097991],
       [0.80226105],
       [0.08116482],
       [0.91451746],
       [0.09105016],
       [0.34516758],
       [0.30997443],
       [0.21672715],
       [0.28807965],
       [0.07999108]], dtype=float32)

We convert the probabilities into yes/no predictions by rounding.

In [24]:
rounded = [round(x[0]) for x in predictions]
In [25]:
print(rounded)
[1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 
0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
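
To see how well these rounded yes/no predictions match the training labels, the agreement can be computed directly (a sketch using the arrays defined above):

accuracy = np.mean(np.array(rounded) == y)  # fraction of patients predicted correctly
print(accuracy)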

MNIST: Predicting Handwritten Digits

In [30]:
from keras.datasets import mnist
In [ ]:
data = mnist.load_data()
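
mnist.load_data() downloads the dataset on first use and returns two (images, labels) tuples, roughly as sketched below; in this notebook the arrays are instead read from a local mnist.npz file.

(x_train, y_train), (x_test, y_test) = mnist.load_data()  # train and test splits as NumPy arrays
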
In [26]:
data = np.load('mnist.npz')
In [27]:
data.keys()
Out[27]:
['x_test', 'x_train', 'y_train', 'y_test']
In [36]:
X_train = data['x_train']
X_test = data['x_test']
y_train = data['y_train']
y_test = data['y_test']
In [66]:
X_test_plot = data['x_test']
In [31]:
print(X_train.shape)
(60000, 28, 28)
In [32]:
import matplotlib.pyplot as plt
In [33]:
plt.imshow(X_train[0])
Out[33]:
<matplotlib.image.AxesImage at 0x275da6d1240>
In [37]:
X_train = X_train.reshape(X_train.shape[0], 28, 28, 1)
X_test = X_test.reshape(X_test.shape[0], 28, 28, 1)
In [38]:
print(X_train.shape)
(60000, 28, 28, 1)
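
The convolutional layers used below expect an explicit channel axis, here (28, 28, 1) for grayscale images; the reshape above only appends this axis of size 1. An equivalent formulation (hypothetical name, np as imported above):

X_train_alt = np.expand_dims(data['x_train'], axis=-1)  # shape (60000, 28, 28, 1), same values as X_train
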
In [40]:
X_train = X_train.astype('float32')
In [41]:
X_test = X_test.astype('float32')
In [42]:
X_train /= 255
In [43]:
X_test /= 255
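
The raw pixel values range from 0 to 255; dividing by 255 scales them into [0, 1], which generally makes training better behaved. A quick check (the printed values should be 0.0 and 1.0):

print(X_train.min(), X_train.max())
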
In [44]:
y_train.shape
Out[44]:
(60000,)
In [45]:
y_train[:10]
Out[45]:
array([5, 0, 4, 1, 9, 2, 1, 3, 1, 4], dtype=uint8)
In [46]:
from keras.utils import np_utils
In [47]:
Y_train = np_utils.to_categorical(y_train, 10)
Y_test = np_utils.to_categorical(y_test, 10)
In [48]:
Y_train[:10]
Out[48]:
array([[0., 0., 0., 0., 0., 1., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
       [0., 0., 0., 0., 1., 0., 0., 0., 0., 0.],
       [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.],
       [0., 0., 0., 0., 0., 0., 0., 0., 0., 1.],
       [0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],
       [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.],
       [0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],
       [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.],
       [0., 0., 0., 0., 1., 0., 0., 0., 0., 0.]], dtype=float32)
In [49]:
model = Sequential()
In [50]:
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Convolution2D, MaxPooling2D
In [51]:
model.add(Convolution2D(32, (3,3), activation='relu', input_shape=(28,28,1)))
In [52]:
model.add(Convolution2D(32, (3,3), activation='relu'))
In [53]:
model.add(MaxPooling2D(pool_size=(2,2)))
In [54]:
model.add(Dropout(0.25))
In [55]:
model.add(Flatten())
In [56]:
model.add(Dense(128, activation='relu'))
In [57]:
model.add(Dropout(0.25))
In [58]:
model.add(Dense(10, activation='softmax'))
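
The softmax layer turns the ten raw scores into a probability distribution over the digits 0-9, i.e. ten non-negative values summing to 1. As a sketch of the operation itself (illustrative, np as imported above):

def softmax(z):
    # subtract the maximum for numerical stability, then normalize the exponentials
    e = np.exp(z - np.max(z))
    return e / e.sum()
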
In [59]:
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
In [61]:
model.fit(X_train, Y_train, batch_size=32, epochs=10)
Epoch 1/10
60000/60000 [==============================] - 157s 3ms/step - loss: 0.1530 - acc: 0.9533
Epoch 2/10
60000/60000 [==============================] - 167s 3ms/step - loss: 0.0582 - acc: 0.9825
Epoch 3/10
60000/60000 [==============================] - 168s 3ms/step - loss: 0.0435 - acc: 0.9866
Epoch 4/10
60000/60000 [==============================] - 166s 3ms/step - loss: 0.0345 - acc: 0.9890
Epoch 5/10
60000/60000 [==============================] - 168s 3ms/step - loss: 0.0302 - acc: 0.9905
Epoch 6/10
60000/60000 [==============================] - 165s 3ms/step - loss: 0.0239 - acc: 0.9921
Epoch 7/10
60000/60000 [==============================] - 163s 3ms/step - loss: 0.0205 - acc: 0.9931
Epoch 8/10
60000/60000 [==============================] - 161s 3ms/step - loss: 0.0171 - acc: 0.9945
Epoch 9/10
60000/60000 [==============================] - 167s 3ms/step - loss: 0.0155 - acc: 0.9949
Epoch 10/10
60000/60000 [==============================] - 160s 3ms/step - loss: 0.0152 - acc: 0.9949
Out[61]:
<keras.callbacks.History at 0x275db588cf8>
In [62]:
score = model.evaluate(X_test, Y_test, verbose=0)
In [63]:
score
Out[63]:
[0.03348006168393522, 0.9925]
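
The list returned by evaluate contains the test loss followed by the metric, so roughly 99.25% of the 10,000 test images are classified correctly:

test_loss, test_acc = score  # cross-entropy loss and accuracy on the test set
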
In [67]:
plt.imshow(X_test_plot[0])
Out[67]:
<matplotlib.image.AxesImage at 0x275de47afd0>
In [69]:
model.predict(X_test)[0]
Out[69]:
array([4.04813854e-17, 4.27083976e-12, 3.23459569e-16, 8.13762252e-12,
       7.12233578e-19, 1.30600265e-16, 6.17592616e-24, 1.00000000e+00,
       1.02562075e-17, 1.10087122e-11], dtype=float32)
In [70]:
predictions = model.predict(X_test)
In [77]:
maximum = predictions.max(axis=1)
In [96]:
worst = [pos for pos in range(len(maximum)) if maximum[pos] < 0.5]
In [97]:
worst
Out[97]:
[659, 2109, 3225, 3558, 3778]
In [95]:
maximum[maximum < 0.5]
Out[95]:
array([0.4468845 , 0.49009094, 0.4399654 , 0.45273992, 0.48349267],
      dtype=float32)
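
The low-confidence cases above are not necessarily misclassified. The predicted digit is the index of the largest probability, so the actual mistakes could be located like this (a sketch using the arrays defined above):

predicted_classes = predictions.argmax(axis=1)               # most probable digit per test image
misclassified = np.nonzero(predicted_classes != y_test)[0]   # indices where prediction and label differ
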
In [101]:
print(predictions[worst[0]])
plt.imshow(X_test_plot[worst[0]])
[1.8582364e-08 4.4688451e-01 4.4022337e-01 1.0641524e-04 2.0676863e-07
 2.8719619e-08 4.8901544e-08 1.1275269e-01 1.0769197e-05 2.1960152e-05]
Out[101]:
<matplotlib.image.AxesImage at 0x275df5c2828>
In [106]:
y_test[worst[0]]
Out[106]:
2
In [102]:
print(predictions[worst[1]])
plt.imshow(X_test_plot[worst[1]])
[1.11594955e-08 3.62636627e-07 4.90090936e-01 7.70639032e-02
 4.63825527e-05 1.42198387e-05 9.87513715e-09 3.44737507e-02
 2.82777622e-02 3.70032668e-01]
Out[102]:
<matplotlib.image.AxesImage at 0x275df61c0b8>
In [107]:
y_test[worst[1]]
Out[107]:
3
In [103]:
print(predictions[worst[2]])
plt.imshow(X_test_plot[worst[2]])
[2.9086951e-07 2.9520717e-01 2.5429060e-06 1.8368043e-08 7.7149051e-04
 1.6801668e-08 3.4765837e-09 2.6258096e-01 1.4720911e-03 4.3996540e-01]
Out[103]:
<matplotlib.image.AxesImage at 0x275df66a9e8>
In [108]:
y_test[worst[2]]
Out[108]:
7
In [104]:
print(predictions[worst[3]])
plt.imshow(X_test_plot[worst[3]])
[4.5273992e-01 1.6775812e-04 1.7671537e-04 3.9434606e-01 5.1602549e-03
 1.3727109e-01 2.6351993e-03 5.2576601e-03 9.3135604e-05 2.1522050e-03]
Out[104]:
<matplotlib.image.AxesImage at 0x275df6c7278>
In [109]:
y_test[worst[3]]
Out[109]:
5
In [105]:
print(predictions[worst[4]])
plt.imshow(X_test_plot[worst[4]])
[1.35544411e-04 7.51636890e-07 1.00647576e-01 6.27233589e-04
 2.79875144e-06 4.83492672e-01 9.12043499e-04 1.12293565e-04
 3.85679334e-01 2.83897724e-02]
Out[105]:
<matplotlib.image.AxesImage at 0x275df716ba8>
In [110]:
y_test[worst[4]]
Out[110]:
5