Engineering | Variable-Length Sequences in TensorFlow Part 3: Using a Sentence-Conditioned BERT Encoder
To conclude this series, we examine the benefits of using a sentence-conditioned BERT model for multi-sentence text data.
Engineering | Variable-Length Sequences in TensorFlow Part 2: Training a Simple BERT Model
In this article, we demonstrate how to train a BERT model on variable-length text data while minimizing training time.
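The post's exact setup lives in the article itself; as a minimal sketch of one common way to keep padding overhead down during training, tf.data can bucket sequences of similar length into the same batch. The bucket boundaries and batch sizes below are illustrative assumptions, not values from the post:

```python
import tensorflow as tf

# Hypothetical variable-length token-ID sequences.
sequences = [[1] * n for n in (5, 40, 7, 90, 33, 120)]

ds = tf.data.Dataset.from_generator(
    lambda: iter(sequences),
    output_signature=tf.TensorSpec(shape=[None], dtype=tf.int32),
)

# Batch sequences of similar length together so each batch carries
# little padding, which reduces wasted compute in the encoder.
ds = ds.bucket_by_sequence_length(
    element_length_func=lambda seq: tf.shape(seq)[0],
    bucket_boundaries=[32, 64, 128],  # assumed length cut-offs
    bucket_batch_sizes=[4, 4, 2, 2],  # one size per bucket, plus overflow
)

for batch in ds:
    print(batch.shape)  # e.g. (2, 7), (2, 40), (2, 120)
```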
Engineering | Variable-Length Sequences in TensorFlow Part 1: Optimizing Sequence Padding
We analyze the impact of sequence padding techniques on model training time for variable-length text data.
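For a flavor of the padding trade-off that post measures (a minimal sketch, not the article's own code), tf.data's padded_batch pads each batch only to the length of its longest member rather than to one corpus-wide maximum:

```python
import tensorflow as tf

# Toy variable-length token-ID sequences.
sequences = [[1, 2, 3], [4, 5], [6, 7, 8, 9, 10]]

ds = tf.data.Dataset.from_generator(
    lambda: iter(sequences),
    output_signature=tf.TensorSpec(shape=[None], dtype=tf.int32),
)

# padded_shapes=[None] pads to the longest sequence in each batch,
# not to a fixed global maximum, so short batches stay short.
ds = ds.padded_batch(batch_size=2, padded_shapes=[None], padding_values=0)

for batch in ds:
    print(batch.numpy())  # [[1 2 3] [4 5 0]] then [[6 7 8 9 10]]
```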
Engineering | Improving Dataflow Pipelines for Text Data Processing
This post shares recipes for improving Cloud Dataflow pipelines that process large-scale sequential text datasets.
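The recipes themselves are in the post; for context, this is the general shape of the Beam text pipeline such recipes would tune (a hypothetical word-count sketch with made-up bucket paths):

```python
import apache_beam as beam

# Hypothetical minimal text pipeline: read lines, tokenize, count tokens.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/text/*.txt")
        | "Tokenize" >> beam.FlatMap(str.split)
        | "PairWithOne" >> beam.Map(lambda token: (token, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/counts")
    )
```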