Why does `doctag_locks` have float elements? The comments seemed clear enough that these control whether weights can be further changed during training, so a boolean would seem the most appropriate type. Is there a reason for floats? Are the values simply used elsewhere to multiply the weight updates? What would be the effect of values other than 0 and 1?
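A minimal NumPy sketch (not gensim's actual code, and the variable names here are my own) of how a float lock factor can gate updates: the lock value multiplies each gradient step, so `0.0` freezes a vector, `1.0` trains it normally, and intermediate values act as a per-vector learning-rate scale — which is one plausible reason for floats over booleans.

```python
import numpy as np

# Hypothetical per-vector lock factors, analogous in spirit to doctag_locks.
vectors = np.ones((3, 4), dtype=np.float32)          # three doc vectors
locks = np.array([1.0, 0.0, 0.5], dtype=np.float32)  # 1.0 = train, 0.0 = freeze, 0.5 = half rate
gradient = np.full((3, 4), 0.2, dtype=np.float32)    # pretend update for one step

# The lock multiplies the update, not the weight itself.
vectors += locks[:, np.newaxis] * gradient

# vectors[0] is fully updated (~1.2), vectors[1] is unchanged (1.0),
# vectors[2] moved at half the step size (~1.1).
```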
All vectors in the docvecs section have nearly the same value (within ±1%). What is wrong, and how can I fix it?
Does "norm" mean "vector with length of 1.0"?
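Assuming "norm" here does refer to a unit-length vector, a short sketch: dividing a vector by its L2 length yields a vector whose length is 1.0.

```python
import numpy as np

# Normalize a vector to unit length (L2 norm).
v = np.array([3.0, 4.0])
unit = v / np.linalg.norm(v)  # [0.6, 0.8]

# The length of the normalized vector is ~1.0.
length = np.linalg.norm(unit)
```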