A:
You can get the data from a Kaggle notebook, and that notebook already contains some setup code that may be useful to you.
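If you have not pulled the files down yet, one way to fetch them locally is with the official kaggle package; this is only a minimal sketch, and "owner/dataset-name" is a placeholder for the actual dataset slug behind that notebook, which you would substitute yourself.

from kaggle.api.kaggle_api_extended import KaggleApi

# Authenticate with the API token stored in ~/.kaggle/kaggle.json,
# then download and unzip the dataset into ./data.
api = KaggleApi()
api.authenticate()
api.dataset_download_files("owner/dataset-name", path="data", unzip=True)

The setup code from the notebook is reproduced below: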
import tensorflow as tf
# Number of distinct training examples.
num_examples = 6
# Number of unique image features.
num_features = 10
# Number of classes.
num_classes = 7
# Number of examples per minibatch.
sample_batch_size = 10
# Number of training steps.
train_steps = 500000
# Momentum for the optimiser.
momentum = 0.9
# Number of validation steps.
valid_steps = 1000
# Number of evaluation images per training example.
num_eval_images = 1
# Random seed.
random_seed = 17
# Indices of the image features to evaluate on the training data.
eval_feature_indices = [10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20]
# Index of the image feature at which the validation score is computed.
valid_feature_indices = [8]
# Feature column describing the real-valued image features (tf.contrib is a TensorFlow 1.x API).
eval_feature_type = tf.contrib.layers.real_valued_column("image_features")
# Global step counter, excluded from training updates.
global_step = tf.Variable(0, trainable=False)
# Batch the feature vectors for the training and validation splits.
# train_features and valid_features are assumed to hold the feature
# arrays loaded from the downloaded data.
train_minibatch_features = tf.contrib.layers.batch_features(
    tf.constant(train_features), sample_batch_size)
valid_minibatch_features = tf.contrib.layers.batch_features(
    tf.constant(valid_features), sample_batch_size)
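Note that tf.contrib was removed in TensorFlow 2.x, so the two batching calls above may not run on a current install. A rough sketch of the same step with tf.data is below; the train_features and valid_features arrays are stand-ins for the features you actually load from the downloaded files, and the feature-column line is only an assumed modern replacement for real_valued_column.

import numpy as np

# Stand-in arrays for the features loaded from the Kaggle data.
train_features = np.random.rand(num_examples, num_features).astype(np.float32)
valid_features = np.random.rand(num_examples, num_features).astype(np.float32)

# Assumed modern equivalent of tf.contrib.layers.real_valued_column.
eval_feature_type = tf.feature_column.numeric_column(
    "image_features", shape=(num_features,))

# Shuffle, repeat, and batch the training split; repeat comes before batch
# so every minibatch is full even though the toy arrays are tiny.
train_minibatches = (tf.data.Dataset.from_tensor_slices(train_features)
                     .shuffle(buffer_size=num_examples, seed=random_seed)
                     .repeat()
                     .batch(sample_batch_size))

# The validation split is batched without shuffling or repetition.
valid_minibatches = (tf.data.Dataset.from_tensor_slices(valid_features)
                     .batch(sample_batch_size))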