I don’t get what you mean by “I’m trying to maximize precision”. When you train, you simply minimize the loss, unless you wrote a custom loss to directly maximize precision (not advised). If you use cross-entropy, the measure closest to what you are maximizing is weighted accuracy (but even that is technically not correct: you minimize the loss, you don’t maximize accuracy or recall).
In your case, since you have unbalanced data, it’s likely the model will learn to favor the bigger class over the smaller one. For example, classifying all instances as the bigger class, without understanding anything about the problem, would already get it 70% accuracy.
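To make that concrete, here’s a toy check of that baseline (assuming class 0 is the bigger class, with a 70/30 split like yours):

```python
import numpy as np

# Hypothetical 70/30 labels: a "classifier" that always predicts the
# bigger class scores 70% accuracy without learning anything.
y = np.array([0] * 70 + [1] * 30)
always_majority = np.zeros_like(y)
print((always_majority == y).mean())  # 0.7
```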
If you just want to account for the imbalance in the data, I would give the bigger class a weight of 0.3 and the smaller one a weight of 0.7 in the loss function. That way, if the model tries to cheat by always predicting one class, it gets a low score. If you are more interested in one class than the other, set the weights accordingly. (I wouldn’t use early stopping on validation precision as done in the linked tutorial though, for the same reason as before.)
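For example, in Keras you can pass the weights through `class_weight` (a minimal sketch with made-up data, assuming a binary classifier where class 0 is the bigger one; adapt the weights and model to your setup):

```python
import numpy as np
import tensorflow as tf

# Hypothetical 70/30 unbalanced binary dataset (placeholder for your data).
rng = np.random.default_rng(0)
x_train = rng.normal(size=(1000, 20)).astype("float32")
y_train = (rng.random(1000) < 0.3).astype("float32")  # ~30% minority class

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# class_weight rescales each sample's loss contribution by its class weight,
# so always predicting the bigger class no longer minimizes the loss.
model.fit(x_train, y_train, epochs=5,
          class_weight={0: 0.3, 1: 0.7}, verbose=0)
```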
Or just follow the example tutorial exactly.
Anyway, you can simply experiment: train several classifiers, plot their behaviour at different thresholds, and choose the most suitable one.
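For instance, with scikit-learn you can sweep the decision threshold with `precision_recall_curve` (a sketch using fake predictions; substitute the probabilities from your own classifiers):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import precision_recall_curve

# Placeholder labels and predicted probabilities (~30% minority class);
# replace these with your validation labels and each model's outputs.
rng = np.random.default_rng(1)
y_val = (rng.random(500) < 0.3).astype(int)
probs = np.clip(y_val * 0.6 + rng.random(500) * 0.5, 0, 1)

precision, recall, thresholds = precision_recall_curve(y_val, probs)

# Plot precision and recall against the decision threshold; pick the
# threshold (and classifier) where the trade-off suits your problem.
plt.plot(thresholds, precision[:-1], label="precision")
plt.plot(thresholds, recall[:-1], label="recall")
plt.xlabel("decision threshold")
plt.legend()
plt.show()
```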