AutoMM for Chinese Named Entity Recognition
===========================================

In this tutorial, we will demonstrate how to use AutoMM for Chinese named entity recognition using an e-commerce dataset extracted from one of the most popular online marketplaces, TaoBao.com. The dataset was collected and labelled by Jie et al., and the text column mainly consists of product descriptions. The following figure shows an example of a Taobao product description.

.. figure:: https://automl-mm-bench.s3.amazonaws.com/ner/images_for_tutorial/chinese_ner.png
   :width: 200px

   Taobao product description. A rabbit toy for lunar new year decoration.

Load the Data
-------------

We have preprocessed the dataset to make it ready to use with AutoMM.

.. code:: python

    import autogluon.multimodal
    from autogluon.core.utils.loaders import load_pd
    from autogluon.multimodal.utils import visualize_ner

    train_data = load_pd.load('https://automl-mm-bench.s3.amazonaws.com/ner/taobao-ner/chinese_ner_train.csv')
    dev_data = load_pd.load('https://automl-mm-bench.s3.amazonaws.com/ner/taobao-ner/chinese_ner_dev.csv')
    train_data.head(5)

.. parsed-literal::
    :class: output

                                            text_snippet                                 entity_annotations
    0  雄争霸点卡/七雄争霸元宝/七雄争霸100元1000元宝直充,自动充值              [{"entity_group": "HCCX", "start": 3, "end": 5...
    1  简约韩版粗跟艾熙百思图亲子鞋冬季百搭街头母女圆头翻边绒面厚底                  [{"entity_group": "HPPX", "start": 6, "end": 8...
    2  羚跑商务背包双肩包男士防盗多功能出差韩版休闲15.6寸电脑包皮潮                [{"entity_group": "HPPX", "start": 0, "end": 2...
    3  热水袋防爆充电暖宝Ÿœ卡通毛绒萌萌可爱注水暖宫暖手宝暖水袋                   [{"entity_group": "HCCX", "start": 0, "end": 3...
    4  童装11周岁13儿童夏装男童套装2017新款10中大童15男孩12秋季5潮7          [{"entity_group": "HCCX", "start": 0, "end": 2...
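To make the annotation format concrete, the ``entity_annotations`` column stores, for each snippet, a JSON-encoded list of entity spans with character offsets into ``text_snippet``, as shown in the table above. A minimal sketch of decoding one such annotation, assuming that format (the strings below mirror the first training row rather than being loaded from the CSV):

.. code:: python

    import json

    # Text and annotation string in the format shown above: a JSON list of
    # entity spans with character offsets into the snippet.
    text = "雄争霸点卡/七雄争霸元宝"
    annotations = '[{"entity_group": "HCCX", "start": 3, "end": 5}]'

    # Decode the string and slice the snippet with the character offsets.
    for span in json.loads(annotations):
        entity_text = text[span["start"]:span["end"]]
        print(span["entity_group"], entity_text)  # HCCX 点卡

Slicing with ``start``/``end`` recovers the exact surface form of each entity, which is handy for spot-checking labels before training.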
HPPX, HCCX, XH, and MISC stand for brand, product, pattern, and miscellaneous information (e.g., product specification), respectively. Let’s visualize one of the examples, which is about *online game top-up services*.

.. code:: python

    visualize_ner(train_data["text_snippet"].iloc[0], train_data["entity_annotations"].iloc[0])

.. parsed-literal::
    :class: output

    雄争霸点卡 [HCCX] /七雄争霸 [MISC] 元宝 [HCCX] /七雄争霸 [MISC] 100元 [MISC] 1000 [MISC] 元宝 [HCCX] 直充,自动充值

Training
--------

With AutoMM, the process for Chinese entity recognition is the same as for English entity recognition. All you need to do is select a suitable foundation model checkpoint that is pretrained on Chinese or multilingual documents. Here we use the ``'hfl/chinese-lert-small'`` backbone for demonstration purposes.

Now, let’s create a predictor for named entity recognition by setting ``problem_type`` to ``ner`` and specifying the label column. Afterwards, we call ``predictor.fit()`` to train the model for a few minutes.

.. code:: python

    from autogluon.multimodal import MultiModalPredictor
    import uuid

    label_col = "entity_annotations"
    model_path = f"./tmp/{uuid.uuid4().hex}-automm_ner"  # You can rename it to the model path you like
    predictor = MultiModalPredictor(problem_type="ner", label=label_col, path=model_path)
    predictor.fit(
        train_data=train_data,
        hyperparameters={'model.ner_text.checkpoint_name': 'hfl/chinese-lert-small'},
        time_limit=300,  # seconds
    )

.. parsed-literal::
    :class: output

    Global seed set to 123
    AutoMM starts to create your model. ✨

    - Model will be saved to "/home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner".
    - Validation metric is "ner_token_f1".
    - To track the learning progress, you can open a terminal and launch Tensorboard:
        # Assume you have installed tensorboard
        tensorboard --logdir /home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner

    Enjoy your coffee, and let AutoMM do the job ☕☕☕
    Learn more at https://auto.gluon.ai

    Using 16bit None Automatic Mixed Precision (AMP)
    GPU available: True (cuda), used: True
    TPU available: False, using: 0 TPU cores
    IPU available: False, using: 0 IPUs
    HPU available: False, using: 0 HPUs
    LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]

      | Name              | Type              | Params
    --------------------------------------------------------
    0 | model             | HFAutoModelForNER | 15.1 M
    1 | validation_metric | F1Score           | 0
    2 | loss_func         | CrossEntropyLoss  | 0
    --------------------------------------------------------
    15.1 M    Trainable params
    0         Non-trainable params
    15.1 M    Total params
    30.173    Total estimated model params size (MB)

    Epoch 0, global step 21: 'val_ner_token_f1' reached 0.39867 (best 0.39867), saving model to '/home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner/epoch=0-step=21.ckpt' as top 3
    Epoch 0, global step 42: 'val_ner_token_f1' reached 0.65621 (best 0.65621), saving model to '/home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner/epoch=0-step=42.ckpt' as top 3
    Epoch 1, global step 64: 'val_ner_token_f1' reached 0.74061 (best 0.74061), saving model to '/home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner/epoch=1-step=64.ckpt' as top 3
    Epoch 1, global step 85: 'val_ner_token_f1' reached 0.75858 (best 0.75858), saving model to '/home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner/epoch=1-step=85.ckpt' as top 3
    Epoch 2, global step 107: 'val_ner_token_f1' reached 0.77960 (best 0.77960), saving model to '/home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner/epoch=2-step=107.ckpt' as top 3
    Epoch 2, global step 128: 'val_ner_token_f1' reached 0.79538 (best 0.79538), saving model to '/home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner/epoch=2-step=128.ckpt' as top 3
    Epoch 3, global step 150: 'val_ner_token_f1' reached 0.80615 (best 0.80615), saving model to '/home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner/epoch=3-step=150.ckpt' as top 3
    Epoch 3, global step 171: 'val_ner_token_f1' reached 0.81192 (best 0.81192), saving model to '/home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner/epoch=3-step=171.ckpt' as top 3
    Epoch 4, global step 193: 'val_ner_token_f1' reached 0.81475 (best 0.81475), saving model to '/home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner/epoch=4-step=193.ckpt' as top 3
    Time limit reached. Elapsed time is 0:05:00. Signaling Trainer to stop.
    Epoch 4, global step 209: 'val_ner_token_f1' reached 0.82675 (best 0.82675), saving model to '/home/ci/autogluon/docs/_build/eval/tutorials/multimodal/text_prediction/tmp/251ab2705c3f4d23b8a604bc2ed93932-automm_ner/epoch=4-step=209.ckpt' as top 3
    Start to fuse 3 checkpoints via the greedy soup algorithm.

Evaluation
----------

To check the model performance on the held-out dataset, all you need to do is call ``predictor.evaluate(...)``.

.. code:: python

    predictor.evaluate(dev_data)

.. parsed-literal::
    :class: output

    {'hccx': {'precision': 0.777818448023426, 'recall': 0.8330066640533125, 'f1': 0.8044671588112815, 'number': 2551},
     'hppx': {'precision': 0.5086705202312138, 'recall': 0.6330935251798561, 'f1': 0.5641025641025641, 'number': 278},
     'misc': {'precision': 0.5936952714535902, 'recall': 0.6726190476190477, 'f1': 0.6306976744186047, 'number': 504},
     'xh': {'precision': 0.5875912408759124, 'recall': 0.6764705882352942, 'f1': 0.6289062499999999, 'number': 238},
     'overall_precision': 0.7139943920469028,
     'overall_recall': 0.7843741248949874,
     'overall_f1': 0.7475313584200692,
     'overall_accuracy': 0.869505400013758}

Prediction and Visualization
----------------------------

You can easily obtain the predictions given an input sentence by calling ``predictor.predict(...)``.

.. code:: python

    output = predictor.predict(dev_data)
    visualize_ner(dev_data["text_snippet"].iloc[0], output[0])

.. parsed-literal::
    :class: output

    家用防尘厨房厨师帽子 [HCCX] 车间工厂鸭舌工作帽 [HCCX] 男女食堂餐厅食 [HCCX] 卫生帽 [HCCX]

Now, let’s make predictions on the rabbit toy example.

.. code:: python

    sentence = "2023年兔年挂件新年装饰品小挂饰乔迁之喜门挂小兔子"
    predictions = predictor.predict({'text_snippet': [sentence]})
    visualize_ner(sentence, predictions[0])

.. parsed-literal::
    :class: output

    2023年兔年挂件 [HCCX] 新年装饰品 [HCCX] 小挂饰 [HCCX] [HPPX] [MISC] [HPPX] 喜门挂 [HCCX] 小兔子 [HCCX]

Other Examples
--------------

You may go to AutoMM Examples to explore other examples about AutoMM.

Customization
-------------

To learn how to customize AutoMM, please refer to :ref:`sec_automm_customization`.
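As a final post-processing sketch: the span format used throughout this tutorial can be turned into plain ``(label, substring)`` pairs for downstream use. This assumes each prediction is a list of dicts with ``entity_group``, ``start``, and ``end`` fields, as in the training annotations; the ``predictions`` list below is a hypothetical stand-in mirroring that format, not a real model output (a real run would obtain it from ``predictor.predict(...)``).

.. code:: python

    # Hypothetical predicted spans for the rabbit toy sentence, in the same
    # character-offset format as the dataset's entity annotations.
    sentence = "2023年兔年挂件新年装饰品小挂饰乔迁之喜门挂小兔子"
    predictions = [
        {"entity_group": "HCCX", "start": 0, "end": 9},
        {"entity_group": "HCCX", "start": 9, "end": 14},
    ]

    def extract_entities(text, spans):
        """Turn predicted character spans into (label, substring) pairs."""
        return [(s["entity_group"], text[s["start"]:s["end"]]) for s in spans]

    pairs = extract_entities(sentence, predictions)
    print(pairs)  # [('HCCX', '2023年兔年挂件'), ('HCCX', '新年装饰品')]

Because the offsets are character-based, plain Python string slicing is sufficient; no tokenizer is needed to recover the entity text.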