{ "cells": [ { "cell_type": "code", "execution_count": null, "id": "8d76e41a-be59-4638-8c3d-9d6a3ce6cc19", "metadata": {}, "outputs": [], "source": [ "#This Jupyter File contains the following scripts for MINISTRAL-3B-BF16:\n", "\n", "#1)The Training Code used to train the LoRA adapters for the model and the output losses (if available).\n", "#->The model can be ran again to check the for the losses.\n", "\n", "#2)The Testing Code used to test the 5 variants of the model at different base precisions using the same BF16 LoRA Adapters.\n", "\n", "#3) The Evaluation Code used to evaluate the responses of the combined model and LoRA Adapters." ] }, { "cell_type": "code", "execution_count": null, "id": "8a617207-c0fa-4f5f-baa6-de109ee29622", "metadata": {}, "outputs": [], "source": [ "#TRAINING SCRIPT FOR MINISTRAL-3B-BF16" ] }, { "cell_type": "code", "execution_count": null, "id": "2a9806a2-ca91-45e2-b533-0f082b9c32dc", "metadata": { "scrolled": true }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/home/jovyan/Falcon1B/lib/python3.11/site-packages/tqdm/auto.py:21: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. See https://ipywidgets.readthedocs.io/en/stable/user_install.html\n", " from .autonotebook import tqdm as notebook_tqdm\n", "[nltk_data] Downloading package punkt to /home/jovyan/nltk_data...\n", "[nltk_data] Package punkt is already up-to-date!\n", "Map: 100%|██████████| 15000/15000 [00:24<00:00, 617.56 examples/s]\n", "Map: 100%|██████████| 1500/1500 [00:02<00:00, 647.97 examples/s]\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "\n", "Sample 0:\n", "Input:\n", " Question: I have written a canny edge detection algorithm for a project. I want to know is there any method to link the broken segments of an edge, since i am getting a single edge as a conglomeration of a few segments. I am getting around 100 segments, which i am sure can be decreased with some intelligence. Please help.\n", "Answer: You can use a method named dynamic programming. 
A very good intro on this can be found on chapter 6 of Sonka's digital image processing book\n", "Label mask:\n", " [-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 28747, 995, 541, 938, 264, 2038, 5160, 10616, 16292, 28723, 330, 1215, 1179, 24671, 356, 456, 541, 347, 1419, 356, 10661, 28705, 28784, 302, 7179, 2117, 28742, 28713, 7153, 3469, 9457, 1820, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2]\n", "\n", "Sample 1:\n", "Input:\n", " Context: I have a dataset of book reviews:\n", "```\n", "| user_id | ISBN | vote | votes_for_user | average_user_vote | ISBN_categ |\n", " 213 3242X 4.5 12 3.4 1 \n", " 563 1245X 3.2 74 2.3 2\n", "```\n", "\n", "where \n", "```\n", " vote = rating given by user to a certain book\n", " votes_for_user = number of votes the user has in the dataset (nr of rows)\n", " average_user_vote = average of a user's votes\n", " ISBN_categ = integer categorical of the ISBN (since that is a string).\n", "```\n", "\n", "I want to apply a clustering algorithm such as DBSCAN to see how many clusters I can form with this dataset. \n", "My question is: \n", "Should I apply the clustering on the dataframe as it is (minus the ISBN column) or should I construct more features for every user and construct a dataframe where every user appears only once, together with their features, and cluster that? \n", "Remember, the intent here is to cluster users (by user_id), not data points (votes).\n", "Question: Clustering of users in a dataset\n", "Answer: If your objective is to find clusters of users, then you are interested in finding groups of \"similar\" reviewers.\n", "Therefore you should:\n", "\n", "- Retain information which relates to the users in a meaningful way - e.g. votes_for_user.\n", "\n", "- Discard information which has no meaningful relationship to a user - e.g. 
user_id (unless perhaps it contains some information such as time / order).\n", "\n", "- Be mindful of fields which may contain implicit relationships involving a user - e.g. vote may be a result of the interaction between user and ISBN.\n", "Label mask:\n", " [-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 28747, 1047, 574, 13640, 349, 298, 1300, 20501, 302, 5443, 28725, 868, 368, 460, 6348, 297, 7484, 4938, 302, 345, 4410, 3052, 28739, 4058, 404, 28723, 13, 5816, 994, 368, 1023, 28747, 13, 13, 28733, 8337, 426, 1871, 690, 1016, 1002, 298, 272, 5443, 297, 264, 19258, 1069, 387, 317, 28723, 28721, 28723, 12768, 28730, 1392, 28730, 1838, 28723, 13, 13, 28733, 3433, 5538, 1871, 690, 659, 708, 19258, 3758, 298, 264, 2188, 387, 317, 28723, 28721, 28723, 2188, 28730, 313, 325, 370, 1503, 5230, 378, 5876, 741, 1871, 1259, 390, 727, 732, 1745, 609, 13, 13, 28733, 1739, 2273, 1007, 302, 5080, 690, 993, 7001, 21628, 9391, 14971, 264, 2188, 387, 317, 28723, 28721, 28723, 7893, 993, 347, 264, 1204, 302, 272, 11186, 1444, 2188, 304, 9337, 28723, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2]\n", "\n", "Sample 2:\n", "Input:\n", " Question: What's a common technical challenge when using logistic regression?\n", "Answer: Dealing with class imbalance, which is when the number of observations in one class is significantly lower than the number of observations in the other class.\n", "Label mask:\n", " [-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 28747, 1343, 4726, 395, 875, 503, 18024, 28725, 690, 349, 739, 272, 1474, 
302, 13875, 297, 624, 875, 349, 11117, 3889, 821, 272, 1474, 302, 13875, 297, 272, 799, 875, 28723, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2]\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "Loading checkpoint shards: 100%|██████████| 3/3 [00:15<00:00, 5.11s/it]\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "trainable params: 73,400,320 || all params: 3,389,116,416 || trainable%: 2.1658\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "/tmp/ipykernel_569719/4074026002.py:175: FutureWarning: `tokenizer` is deprecated and will be removed in version 5.0.0 for `Trainer.__init__`. Use `processing_class` instead.\n", " trainer = Trainer(model=model, args=training_args, train_dataset=train_dataset, eval_dataset=test_dataset, tokenizer=tokenizer)\n", "Warning: The following arguments do not match the ones in the `trainer_state.json` within the checkpoint directory: \n", "\tsave_steps: 200 (from args) != 100 (from trainer_state.json)\n" ] }, { "data": { "text/html": [ "\n", "
Step | \n", "Training Loss | \n", "Validation Loss | \n", "
---|---|---|
110 | \n", "2.815900 | \n", "1.579408 | \n", "
120 | \n", "2.783500 | \n", "1.573392 | \n", "
130 | \n", "2.923600 | \n", "1.566912 | \n", "
140 | \n", "2.849400 | \n", "1.560509 | \n", "
150 | \n", "2.923500 | \n", "1.554506 | \n", "
160 | \n", "2.831600 | \n", "1.549700 | \n", "
170 | \n", "2.808500 | \n", "1.544646 | \n", "
180 | \n", "2.740600 | \n", "1.539551 | \n", "
190 | \n", "2.940100 | \n", "1.534406 | \n", "
200 | \n", "2.895300 | \n", "1.528924 | \n", "
210 | \n", "2.601100 | \n", "1.524439 | \n", "
220 | \n", "2.727100 | \n", "1.521273 | \n", "
230 | \n", "2.894900 | \n", "1.517426 | \n", "
240 | \n", "2.780100 | \n", "1.513114 | \n", "
250 | \n", "2.768500 | \n", "1.508721 | \n", "
260 | \n", "2.675600 | \n", "1.504693 | \n", "
270 | \n", "2.764800 | \n", "1.502155 | \n", "
280 | \n", "2.724500 | \n", "1.497066 | \n", "
290 | \n", "2.937800 | \n", "1.493658 | \n", "
300 | \n", "2.759800 | \n", "1.491170 | \n", "
310 | \n", "2.704700 | \n", "1.486125 | \n", "
320 | \n", "2.627200 | \n", "1.483599 | \n", "
330 | \n", "2.778300 | \n", "1.478788 | \n", "
340 | \n", "2.604400 | \n", "1.478175 | \n", "
350 | \n", "2.707500 | \n", "1.474785 | \n", "
360 | \n", "2.650300 | \n", "1.471617 | \n", "
370 | \n", "2.692400 | \n", "1.468295 | \n", "
380 | \n", "2.698500 | \n", "1.468557 | \n", "
390 | \n", "2.695300 | \n", "1.465367 | \n", "
400 | \n", "2.553200 | \n", "1.463014 | \n", "
410 | \n", "2.472200 | \n", "1.461501 | \n", "
420 | \n", "2.642600 | \n", "1.456720 | \n", "
430 | \n", "2.543800 | \n", "1.455238 | \n", "
440 | \n", "2.597000 | \n", "1.451704 | \n", "
450 | \n", "2.669300 | \n", "1.448452 | \n", "
460 | \n", "2.543300 | \n", "1.448239 | \n", "
470 | \n", "2.541200 | \n", "1.444733 | \n", "
480 | \n", "2.597900 | \n", "1.439093 | \n", "
490 | \n", "2.645600 | \n", "1.438198 | \n", "
500 | \n", "2.544600 | \n", "1.434592 | \n", "
510 | \n", "2.659000 | \n", "1.433304 | \n", "
520 | \n", "2.612200 | \n", "1.431921 | \n", "
530 | \n", "2.494500 | \n", "1.428637 | \n", "
540 | \n", "2.528800 | \n", "1.425427 | \n", "
550 | \n", "2.498100 | \n", "1.419751 | \n", "
560 | \n", "2.612700 | \n", "1.420925 | \n", "
570 | \n", "2.462800 | \n", "1.418557 | \n", "
580 | \n", "2.594300 | \n", "1.414138 | \n", "
590 | \n", "2.661800 | \n", "1.408737 | \n", "
600 | \n", "2.407800 | \n", "1.406123 | \n", "
610 | \n", "2.489500 | \n", "1.405649 | \n", "
620 | \n", "2.471900 | \n", "1.403713 | \n", "
630 | \n", "2.527500 | \n", "1.399968 | \n", "
640 | \n", "2.473100 | \n", "1.404927 | \n", "
650 | \n", "2.446800 | \n", "1.404123 | \n", "
660 | \n", "2.534300 | \n", "1.403481 | \n", "
670 | \n", "2.438200 | \n", "1.400137 | \n", "
680 | \n", "2.392500 | \n", "1.396842 | \n", "
690 | \n", "2.476300 | \n", "1.392489 | \n", "
700 | \n", "2.531600 | \n", "1.390788 | \n", "
710 | \n", "2.460600 | \n", "1.391676 | \n", "
720 | \n", "2.460800 | \n", "1.390317 | \n", "
730 | \n", "2.417900 | \n", "1.389893 | \n", "
740 | \n", "2.390200 | \n", "1.390820 | \n", "
750 | \n", "2.335300 | \n", "1.390020 | \n", "
760 | \n", "2.347400 | \n", "1.384070 | \n", "
770 | \n", "2.403500 | \n", "1.381093 | \n", "
780 | \n", "2.505800 | \n", "1.382190 | \n", "
790 | \n", "2.383400 | \n", "1.383101 | \n", "
800 | \n", "2.489100 | \n", "1.376631 | \n", "
810 | \n", "2.337500 | \n", "1.377839 | \n", "
820 | \n", "2.455500 | \n", "1.372652 | \n", "
830 | \n", "2.490800 | \n", "1.375561 | \n", "
840 | \n", "2.335800 | \n", "1.377074 | \n", "
850 | \n", "2.332400 | \n", "1.381941 | \n", "
860 | \n", "2.397000 | \n", "1.375122 | \n", "
870 | \n", "2.366900 | \n", "1.375663 | \n", "
880 | \n", "2.381700 | \n", "1.375288 | \n", "
890 | \n", "2.321400 | \n", "1.375537 | \n", "
900 | \n", "2.319900 | \n", "1.371483 | \n", "
910 | \n", "2.354500 | \n", "1.370443 | \n", "
920 | \n", "2.373600 | \n", "1.369424 | \n", "
930 | \n", "2.314700 | \n", "1.364816 | \n", "
940 | \n", "2.317400 | \n", "1.369500 | \n", "
950 | \n", "2.268700 | \n", "1.368308 | \n", "
960 | \n", "2.347400 | \n", "1.364427 | \n", "
970 | \n", "2.370800 | \n", "1.368014 | \n", "
980 | \n", "2.344800 | \n", "1.363824 | \n", "
990 | \n", "2.270900 | \n", "1.364423 | \n", "
1000 | \n", "2.354900 | \n", "1.361838 | \n", "
1010 | \n", "2.368000 | \n", "1.360603 | \n", "
1020 | \n", "2.387500 | \n", "1.358095 | \n", "
1030 | \n", "2.370000 | \n", "1.357302 | \n", "
1040 | \n", "2.433400 | \n", "1.361243 | \n", "
1050 | \n", "2.337800 | \n", "1.363192 | \n", "
1060 | \n", "2.245900 | \n", "1.360574 | \n", "
1070 | \n", "2.189400 | \n", "1.365487 | \n", "
1080 | \n", "2.370400 | \n", "1.361998 | \n", "
1090 | \n", "2.241400 | \n", "1.362879 | \n", "
1100 | \n", "2.366700 | \n", "1.361105 | \n", "
1110 | \n", "2.187300 | \n", "1.369426 | \n", "
1120 | \n", "2.270900 | \n", "1.364161 | \n", "
1130 | \n", "2.280600 | \n", "1.363960 | \n", "
1140 | \n", "2.284100 | \n", "1.360710 | \n", "
1150 | \n", "2.226500 | \n", "1.363731 | \n", "
1160 | \n", "2.290000 | \n", "1.360991 | \n", "
1170 | \n", "2.154400 | \n", "1.361415 | \n", "
1180 | \n", "2.248800 | \n", "1.355173 | \n", "
1190 | \n", "2.315400 | \n", "1.351709 | \n", "
1200 | \n", "2.220900 | \n", "1.356196 | \n", "
1210 | \n", "2.240800 | \n", "1.353414 | \n", "
1220 | \n", "2.236200 | \n", "1.351822 | \n", "
1230 | \n", "2.274200 | \n", "1.352525 | \n", "
1240 | \n", "2.267300 | \n", "1.357203 | \n", "
1250 | \n", "2.277900 | \n", "1.349335 | \n", "
1260 | \n", "2.210600 | \n", "1.355585 | \n", "
1270 | \n", "2.168500 | \n", "1.360445 | \n", "
1280 | \n", "2.166600 | \n", "1.360253 | \n", "
1290 | \n", "2.259600 | \n", "1.361347 | \n", "
1300 | \n", "2.123500 | \n", "1.360584 | \n", "
1310 | \n", "2.146800 | \n", "1.354987 | \n", "
1320 | \n", "2.198800 | \n", "1.363066 | \n", "
1330 | \n", "2.275300 | \n", "1.355370 | \n", "
1340 | \n", "2.294600 | \n", "1.357998 | \n", "
1350 | \n", "2.264500 | \n", "1.357981 | \n", "
1360 | \n", "2.250900 | \n", "1.354811 | \n", "
1370 | \n", "2.144900 | \n", "1.354355 | \n", "
1380 | \n", "2.133900 | \n", "1.353007 | \n", "
1390 | \n", "2.249000 | \n", "1.357149 | \n", "
1400 | \n", "2.208400 | \n", "1.358126 | \n", "
1410 | \n", "2.190600 | \n", "1.357346 | \n", "
1420 | \n", "2.109200 | \n", "1.358977 | \n", "
1430 | \n", "2.222000 | \n", "1.357165 | \n", "
1440 | \n", "2.125800 | \n", "1.355742 | \n", "
1450 | \n", "2.222800 | \n", "1.354219 | \n", "
1460 | \n", "2.156300 | \n", "1.357865 | \n", "
1470 | \n", "2.117400 | \n", "1.356275 | \n", "
1480 | \n", "2.208300 | \n", "1.362210 | \n", "
1490 | \n", "2.138100 | \n", "1.362097 | \n", "
1500 | \n", "2.147700 | \n", "1.362377 | \n", "
1510 | \n", "2.194500 | \n", "1.362892 | \n", "
1520 | \n", "2.220900 | \n", "1.362169 | \n", "
1530 | \n", "2.117300 | \n", "1.361786 | \n", "
1540 | \n", "2.165600 | \n", "1.361097 | \n", "
1550 | \n", "2.262900 | \n", "1.360748 | \n", "
1560 | \n", "2.125000 | \n", "1.361231 | \n", "
1570 | \n", "2.141500 | \n", "1.361740 | \n", "
1580 | \n", "2.143600 | \n", "1.361741 | \n", "
1590 | \n", "2.096500 | \n", "1.361593 | \n", "
1600 | \n", "2.172000 | \n", "1.361407 | \n", "
1610 | \n", "2.167900 | \n", "1.361214 | \n", "
1620 | \n", "2.096700 | \n", "1.361376 | \n", "
1630 | \n", "2.184200 | \n", "1.361329 | \n", "
1640 | \n", "2.203900 | \n", "1.361158 | \n", "
1650 | \n", "2.191700 | \n", "1.361280 | \n", "
1660 | \n", "2.121000 | \n", "1.361340 | \n", "
1670 | \n", "2.144800 | \n", "1.361310 | \n", "
1680 | \n", "2.096800 | \n", "1.358080 | \n", "
1690 | \n", "2.236200 | \n", "1.355283 | \n", "
1700 | \n", "2.203700 | \n", "1.360003 | \n", "
1710 | \n", "2.217200 | \n", "1.362695 | \n", "
1720 | \n", "2.279700 | \n", "1.364773 | \n", "
1730 | \n", "2.194600 | \n", "1.354236 | \n", "
1740 | \n", "2.199200 | \n", "1.360961 | \n", "
1750 | \n", "2.127900 | \n", "1.361756 | \n", "
1760 | \n", "2.177800 | \n", "1.361055 | \n", "
1770 | \n", "2.155500 | \n", "1.353681 | \n", "
1780 | \n", "2.041900 | \n", "1.357954 | \n", "
1790 | \n", "2.159900 | \n", "1.352807 | \n", "
1800 | \n", "2.145200 | \n", "1.350229 | \n", "
1810 | \n", "2.232900 | \n", "1.349088 | \n", "
1820 | \n", "2.142200 | \n", "1.346589 | \n", "
1830 | \n", "2.147500 | \n", "1.353956 | \n", "
1840 | \n", "2.171600 | \n", "1.348997 | \n", "
1850 | \n", "2.184500 | \n", "1.349901 | \n", "
1860 | \n", "2.143500 | \n", "1.346601 | \n", "
1870 | \n", "2.208400 | \n", "1.347847 | \n", "
1880 | \n", "2.275400 | \n", "1.343300 | \n", "
1890 | \n", "1.979600 | \n", "1.363317 | \n", "
1900 | \n", "2.066200 | \n", "1.359354 | \n", "
1910 | \n", "2.103900 | \n", "1.360669 | \n", "
1920 | \n", "2.063600 | \n", "1.357311 | \n", "
1930 | \n", "2.176900 | \n", "1.352104 | \n", "
1940 | \n", "2.079400 | \n", "1.355342 | \n", "
1950 | \n", "2.077900 | \n", "1.356416 | \n", "
1960 | \n", "2.101500 | \n", "1.353595 | \n", "
1970 | \n", "2.122700 | \n", "1.355783 | \n", "
1980 | \n", "2.153100 | \n", "1.348958 | \n", "
1990 | \n", "2.086500 | \n", "1.354040 | \n", "
2000 | \n", "2.148500 | \n", "1.357381 | \n", "
2010 | \n", "2.164200 | \n", "1.355931 | \n", "
2020 | \n", "2.081600 | \n", "1.352585 | \n", "
2030 | \n", "2.089100 | \n", "1.352731 | \n", "
2040 | \n", "2.162700 | \n", "1.350476 | \n", "
2050 | \n", "2.168000 | \n", "1.348094 | \n", "
2060 | \n", "2.082800 | \n", "1.349910 | \n", "
2070 | \n", "2.102200 | \n", "1.354040 | \n", "
2080 | \n", "2.026700 | \n", "1.352565 | \n", "
2090 | \n", "2.129100 | \n", "1.354204 | \n", "
2100 | \n", "2.047300 | \n", "1.368523 | \n", "
2110 | \n", "2.045100 | \n", "1.357555 | \n", "
2120 | \n", "2.054600 | \n", "1.362660 | \n", "
2130 | \n", "2.125000 | \n", "1.359936 | \n", "
2140 | \n", "2.111100 | \n", "1.359257 | \n", "
2150 | \n", "1.995000 | \n", "1.359139 | \n", "
2160 | \n", "1.994100 | \n", "1.355912 | \n", "
2170 | \n", "2.013300 | \n", "1.358997 | \n", "
2180 | \n", "1.967800 | \n", "1.358274 | \n", "
2190 | \n", "1.948800 | \n", "1.358114 | \n", "
2200 | \n", "1.934900 | \n", "1.357811 | \n", "
2210 | \n", "2.067600 | \n", "1.356543 | \n", "
2220 | \n", "2.142500 | \n", "1.362457 | \n", "
2230 | \n", "2.044100 | \n", "1.356360 | \n", "
2240 | \n", "2.004300 | \n", "1.354497 | \n", "
2250 | \n", "2.046500 | \n", "1.358393 | \n", "
2260 | \n", "1.969000 | \n", "1.361362 | \n", "
2270 | \n", "2.057800 | \n", "1.358096 | \n", "
2280 | \n", "2.018800 | \n", "1.356798 | \n", "
2290 | \n", "2.135100 | \n", "1.357215 | \n", "
2300 | \n", "2.058800 | \n", "1.359850 | \n", "
2310 | \n", "1.993200 | \n", "1.362604 | \n", "
2320 | \n", "2.095200 | \n", "1.364936 | \n", "
2330 | \n", "1.942000 | \n", "1.366520 | \n", "
2340 | \n", "1.954500 | \n", "1.366892 | \n", "
2350 | \n", "2.138000 | \n", "1.366118 | \n", "
2360 | \n", "1.942700 | \n", "1.365882 | \n", "
2370 | \n", "2.000800 | \n", "1.365182 | \n", "
2380 | \n", "2.028400 | \n", "1.365309 | \n", "
2390 | \n", "2.071700 | \n", "1.364362 | \n", "
2400 | \n", "1.862200 | \n", "1.365175 | \n", "
2410 | \n", "1.992300 | \n", "1.366040 | \n", "
2420 | \n", "2.006800 | \n", "1.365316 | \n", "
2430 | \n", "1.987700 | \n", "1.365014 | \n", "
\n", "