Tango Archiving

Hello Balkrishna,

bchitalia2 wrote:
Hi Katy

1. Could you please confirm with Raphael and guide me on how to archive Tango attributes in HDB at a rate of 1 second?

2. I have read the Mambo manual and I am unable to find where it states for how long TDB stores the archived data. Could you please confirm?
The HDB archiving period can't be lower than 10 s.
The TDB archiving period can't be lower than 100 ms.

So if you want to test HDB first, do it without a fast rate.
Test with 10 s and then try to extract the data through a VC.

If you want a rate of 1 s, you can try TDB, the temporary database. (The drawback is that the data will not be kept as long as in the historical database.)
When you create a new AC, check the "Temporary AC" checkbox. In this mode you will be able to set a 1 s period.
You have to select your attribute in the tree and click on the "Set" button; the attribute will then be shown in bold in the tree.

bchitalia2 wrote:
3. The VC is used for monitoring the values of attributes after starting archiving. For the Tango attributes I have archived in HDB, I can see the archived attributes list and their values get updated every 10 seconds (graphs etc.). But for TDB, after starting archiving, I can see the archived attributes list in the VC, but the values do not get updated after clicking on refresh. Could you please tell me the reason?

Please check the console at the bottom of Mambo.
Does the connection to the TDB database work?
Is the attribute OK? (If it is not, you should have some NULLs in the database.)
Normally, for TDB, the attributes are first stored in a file defined by the TDBArchiver DbPath property.
The data are then exported to the database every ExportPeriod (a TDBArchiver property, in ms, which can be defined at the class level).
Perhaps this means that your data have not been exported to the TDB database yet, so check your data file.
Is the bulb brown? (That means the attribute is archived in TDB.)
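
If it helps, here is a minimal sketch (not part of the original answer; the archiver device name is a placeholder, and the properties may be empty on the device if they are defined at class level) showing how to read the DbPath and ExportPeriod properties with the Tango Java API instead of looking them up in Jive:

import fr.esrf.Tango.DevFailed;
import fr.esrf.TangoApi.DbDatum;
import fr.esrf.TangoApi.DeviceProxy;

public class CheckTdbArchiverProperties {
    public static void main(String[] args) throws DevFailed {
        // Placeholder device name: use one of your own TdbArchiver instances.
        DeviceProxy archiver = new DeviceProxy("archiving/tdbarchiver/01");

        // Folder where the archiver first writes its data files.
        DbDatum dbPath = archiver.get_property("DbPath");
        if (!dbPath.is_empty()) {
            System.out.println("DbPath = " + dbPath.extractString());
        }

        // Export period towards the TDB database, in ms (may be defined at class level instead).
        DbDatum exportPeriod = archiver.get_property("ExportPeriod");
        if (!exportPeriod.is_empty()) {
            System.out.println("ExportPeriod = " + exportPeriod.extractString());
        }
    }
}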

bchitalia2 wrote:
4. As you said, there are other ways to extract the data. They are:
a.) Click on the ExtractBetweenDates command in Jive and write the argument.
But whenever I write it in the format you mentioned in your reply above, it says either "attribute not found" or that the argument is invalid. Maybe I am making a mistake somewhere in writing the argument, so could you please guide me? Suppose the attribute is Speed, the start date is 26-05-2015 08:08:08 and the end date is 27-05-2015 15:30:30; how would you write the argument?

b.) I am facing the same issue with GetAttDataLastN.

First check that your attribute is properly archived, by executing the GetCurrentArchivedAtt command on the extractor device.

Then here are working arguments:
GetAttDataLastN "ANS-C09/VI/CALC-JPEN.1/mean","20"
This will create a dynamic attribute that you can read afterwards. The response of the command is the name of the new attribute and also its length.

ExtractBetweenDates "ANS-C09/VI/CALC-JPEN.1/mean","2015-05-27 18:08:00.184","2015-05-28 18:08:00.184"
You are missing the milliseconds in your sample… The advantage is that the answer contains the values themselves.
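
For completeness, here is a rough sketch of calling the same commands from Java with the Tango API. It is only an illustration, not part of the original answer: the extractor device name is a placeholder, and the exact argin/argout types should be checked in the command descriptions in Jive.

import fr.esrf.Tango.DevFailed;
import fr.esrf.TangoApi.DeviceData;
import fr.esrf.TangoApi.DeviceProxy;

public class HdbExtractionExample {
    public static void main(String[] args) throws DevFailed {
        // Placeholder device name: use your own HDB extractor device.
        DeviceProxy extractor = new DeviceProxy("archiving/hdbextractor/1");

        // ExtractBetweenDates: attribute name, start date and end date (with milliseconds).
        DeviceData between = new DeviceData();
        between.insert(new String[] {
                "ANS-C09/VI/CALC-JPEN.1/mean",
                "2015-05-27 18:08:00.184",
                "2015-05-28 18:08:00.184" });
        DeviceData values = extractor.command_inout("ExtractBetweenDates", between);
        // Assumption: the argout holds the extracted values; check its exact type in Jive.
        System.out.println(values);

        // GetAttDataLastN: attribute name and number of samples. The answer gives the name
        // of a dynamic attribute (and its length) that can then be read on the extractor.
        DeviceData lastN = new DeviceData();
        lastN.insert(new String[] { "ANS-C09/VI/CALC-JPEN.1/mean", "20" });
        DeviceData answer = extractor.command_inout("GetAttDataLastN", lastN);
        System.out.println(answer);
    }
}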

bchitalia2 wrote:
5. DataBrowserUserManual.pdf is not present in the doc folder of the Archiving root. Could you please provide me with the link to download it?

The DataBrowser is not in the ARCHIVING package. You will find the latest distribution here: http://wiki.synchrotron-soleil.fr/tikiwiki/tiki-index.php?page=How-browse-dowloaded-Nexus-files
This software is used at SOLEIL to read NeXus or HDF files, TANGO data and Archiving data. You can see the different sources in the same plot.

bchitalia2 wrote:
6. I have to extract the Tango attributes from HDB on a different machine. How can I achieve this? You have replied to this question above, but could you please elaborate, as I am unable to do it?

When you say on a different machine, do you mean a different Tango database? If so, I misunderstood your question. Could you explain the architecture to me? Several Tango databases? One archiving database? …

bchitalia2 wrote:
7. Also, in the Bensikin tool, is there any way for snapshots to be uploaded to the snapshot panel at a regular interval rather than clicking on the "launch snapshot" button again and again? I need snapshots to be uploaded to the snapshot panel at a rate of 1 second. How can I achieve that?
Thanks and Best Regards
Thanks and Best Regards

Balkrishna

In Bensikin, there is no snapshot automation. So for now, you can use the SnapShotManager device and its LaunchSnapShot command with the context id as an argument.
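
If you need the 1 s rate, here is a minimal sketch (only an illustration: the SnapshotManager device name and the LaunchSnapShot argument type are assumptions to be checked in Jive) of triggering the command periodically from Java:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import fr.esrf.TangoApi.DeviceData;
import fr.esrf.TangoApi.DeviceProxy;

public class PeriodicSnapshot {
    public static void main(String[] args) throws Exception {
        // Placeholder device name: use your own SnapshotManager device.
        DeviceProxy manager = new DeviceProxy("archiving/snap/snapmanager.1");
        int contextId = 1; // id of the snapshot context to launch

        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(() -> {
            try {
                DeviceData argin = new DeviceData();
                argin.insert(contextId); // assumption: LaunchSnapShot takes the context id as a DevLong
                manager.command_inout("LaunchSnapShot", argin);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }, 0, 1, TimeUnit.SECONDS);
    }
}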

We are working on AlarmTool (included in the Archiving package) to launch a snapshot automatically when an alarm event occurs.

If you want to request an Archiving evolution, do not hesitate to do it in the Tango mediatracker or to ask Raphaël Girardot directly.

Best Regards

Katy

Hi Katy,

Thanks for the reply and solutions.

I have some more queries,

The HDB archiving period can't be lower than 10 s.
Is there any workaround to get a fast rate of 1 s in HDB?

The TDB archiving period can't be lower than 100 ms.

If you want a rate of 1 s, you can try TDB, the temporary database. (The drawback is that the data will not be kept as long as in the historical database.)

I read in a ppt that it stores data for 30 days; can we customize this?

3. The VC is used for monitoring the values of attributes after starting archiving. For the Tango attributes I have archived in HDB, I can see the archived attributes list and their values get updated every 10 seconds (graphs etc.). But for TDB, after starting archiving, I can see the archived attributes list in the VC, but the values do not get updated after clicking on refresh. Could you please tell me the reason?

Please check the console at the bottom of Mambo.
Does the connection to the TDB database work?
Is the attribute OK? (If it is not, you should have some NULLs in the database.)
Normally, for TDB, the attributes are first stored in a file defined by the TDBArchiver DbPath property.
The data are then exported to the database every ExportPeriod (a TDBArchiver property, in ms, which can be defined at the class level).
Perhaps this means that your data have not been exported to the TDB database yet, so check your data file.
Is the bulb brown? (That means the attribute is archived in TDB.)

Regarding my issue with the view configuration (VC) part for TDB:

The console at the bottom of Mambo reads:

29-05-15 10:34:34.219 - DEBUG: Archiving successfully started
29-05-15 10:34:44.992 - INFO : extract from DB for p/q/r/WindSpeed took 49 ms
29-05-15 10:34:45.006 - INFO : extract from DB for p/q/r/Speed took 11 ms
29-05-15 10:34:45.016 - INFO : extract from DB for p/q/r/State took 9 ms
29-05-15 10:34:45.033 - INFO : extract from DB for p/q/r/Status took 16 ms
29-05-15 10:34:50.070 - INFO : extract from DB for p/q/r/WindSpeed took 25 ms

So no error here.
Also, the TDB connection is OK, the attribute is fine and the data is archived in TDB, as indicated by the brown bulb.
But when I transfer to the VC, the attribute list appears but no data is updated (no graphs), whereas when I do the same thing for HDB, everything is perfectly fine.

bchitalia2 wrote:
6. I have to extract the Tango attributes from HDB on a different machine. How can I achieve this? You have replied to this question above, but could you please elaborate, as I am unable to do it?
When you say on a different machine, do you mean a different Tango database? If so, I misunderstood your question. Could you explain the architecture to me? Several Tango databases? One archiving database? …

Yes, I am using different Tango databases and one archiving database. I tried using comma-separated host entries in the TANGO_HOST variable. Only the first Tango database host in the TANGO_HOST list and its devices get listed in the Mambo AC; I am unable to get the other hosts and their devices.

This is the error coming from the DataBrowser GUI:

*****************************************

Jun 01, 2015 3:43:47 PM fr.soleil.data.service.LoggingSystemDelegate log
SEVERE: Failed to transmit p/q/r/speed history data to targets
java.lang.ClassCastException: [[D cannot be cast to fr.soleil.data.container.matrix.AbstractNumberMatrix
at fr.soleil.data.controller.NumberMatrixController$WildcardNumberAdapter.adaptSourceData(NumberMatrixController.java:78)
at fr.soleil.data.adapter.source.DataSourceAdapter.getData(DataSourceAdapter.java:47)
at fr.soleil.data.controller.DataTargetController.transmitDataToTarget(DataTargetController.java:115)
at fr.soleil.data.mediator.AbstractController.transmitSourceChange(AbstractController.java:542)
at fr.soleil.data.source.AbstractDataSource.notifyMediators(AbstractDataSource.java:106)
at fr.soleil.data.source.BufferedDataSource.updateData(BufferedDataSource.java:79)
at fr.soleil.data.source.BufferedDataSource.updateData(BufferedDataSource.java:70)
at fr.soleil.data.service.thread.DataSourceRefreshingThread.refreshData(DataSourceRefreshingThread.java:47)
at fr.soleil.data.service.thread.AbstractRefreshingThread.run(AbstractRefreshingThread.java:55)

[[D cannot be cast to fr.soleil.data.container.matrix.AbstractNumberMatrix
*****************************************

*****************************************
Jun 02, 2015 12:20:33 PM fr.soleil.data.service.LoggingSystemDelegate log
SEVERE: Failed to transmit sys/tg_test/1/wave history data to targets
fr.soleil.data.exception.DataAdaptationException: fr.soleil.comete.definition.data.adapter.ObjectToStringMapAdapter can't adapt data of class java.util.ArrayList
at fr.soleil.comete.definition.data.adapter.ObjectToStringMapAdapter.generateDefaultException(ObjectToStringMapAdapter.java:386)
at fr.soleil.comete.definition.data.adapter.ObjectToStringMapAdapter.extractArraysFromUnusualData(ObjectToStringMapAdapter.java:369)
at fr.soleil.comete.definition.data.adapter.ObjectToStringMapAdapter.extractArrays(ObjectToStringMapAdapter.java:225)
at fr.soleil.comete.definition.data.adapter.ObjectToStringMapAdapter.adapt(ObjectToStringMapAdapter.java:95)
at fr.soleil.comete.definition.data.adapter.ObjectToStringMapAdapter.adapt(ObjectToStringMapAdapter.java:28)
at fr.soleil.data.adapter.source.DataSourceAdapter.adaptSourceData(DataSourceAdapter.java:69)
at fr.soleil.comete.definition.data.adapter.DataArrayAdapter.adaptSourceData(DataArrayAdapter.java:60)
at fr.soleil.comete.definition.data.adapter.DataArrayAdapter.adaptSourceData(DataArrayAdapter.java:24)
at fr.soleil.data.adapter.source.DataSourceAdapter.getData(DataSourceAdapter.java:47)
at fr.soleil.data.controller.DataTargetController.transmitDataToTarget(DataTargetController.java:115)
at fr.soleil.data.mediator.AbstractController.transmitSourceChange(AbstractController.java:542)
at fr.soleil.data.source.AbstractDataSource.notifyMediators(AbstractDataSource.java:106)
at fr.soleil.data.source.MultiDataSource$FakeMediator.transmitSourceChange(MultiDataSource.java:222)
at fr.soleil.data.source.AbstractDataSource.notifyMediators(AbstractDataSource.java:106)
at fr.soleil.data.source.BufferedDataSource.updateData(BufferedDataSource.java:79)
at fr.soleil.data.source.BufferedDataSource.updateData(BufferedDataSource.java:70)
at fr.soleil.data.service.thread.DataSourceRefreshingThread.refreshData(DataSourceRefreshingThread.java:47)
at fr.soleil.data.service.thread.AbstractRefreshingThread.run(AbstractRefreshingThread.java:55)
fr.soleil.comete.definition.data.adapter.ObjectToStringMapAdapter can't adapt data of class java.util.ArrayList
*****************************************
Thanks.
Balkrishna
Hi Katy

Please reply to my above queries.

Thanks
Balkrishna
Hello,

I will let my colleague Raphaël Girardot, who is in charge of Archiving, answer you.

Best regards,

Katy
Hello Balkrishna

I will answer your questions
Is there any workaround to get a fast rate of 1 s in HDB?
As a matter of fact, there is a workaround. But first, you have to understand that HDB is expected to store data forever, which is why it was decided to put this kind of limit. DB administrators won't like to see their DB storage growing too fast, and users have to ask themselves whether it is really useful to store some data forever at a higher rate than every 10 s. That being said, the workaround is to use the "shortPeriodAttributes" property of the HdbArchiver class. This property must be written this way:
"Attribute complete name,minimum period in seconds".
For example:
"
tango/tangotest/1/short_scalar_ro,2
tango/tangotest/1/double_scalar_ro,1
"
So, for these attributes, you define the maximum archiving frequency, which can't be higher than 1 Hz (every second) (and no, there is no further workaround for this limit).
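
As an illustration only (not part of the answer above), the class property can also be filled from code with the Tango Java API; restart the HdbArchiver devices afterwards so they take the new value into account:

import fr.esrf.Tango.DevFailed;
import fr.esrf.TangoApi.ApiUtil;
import fr.esrf.TangoApi.Database;
import fr.esrf.TangoApi.DbDatum;

public class SetShortPeriodAttributes {
    public static void main(String[] args) throws DevFailed {
        Database db = ApiUtil.get_db_obj();
        // One "attribute complete name,minimum period in seconds" entry per line.
        DbDatum shortPeriods = new DbDatum("shortPeriodAttributes");
        shortPeriods.insert(new String[] {
                "tango/tangotest/1/short_scalar_ro,2",
                "tango/tangotest/1/double_scalar_ro,1" });
        db.put_class_property("HdbArchiver", new DbDatum[] { shortPeriods });
    }
}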


I read in a ppt that it stores data for 30 days; can we customize this?
Yes, you can, and in fact you MUST. Let me explain:
TDB cleaning is done by TdbCleaner, which is for now only available for Linux.
This cleaner should be registered in crontab, to be executed regularly (this is the part for which I wrote "you MUST").
To know which data is considered too old and must be cleaned, TdbCleaner reads the "RetentionPeriod" property of the TdbArchiver class. This property must be filled this way: "time unit/value", where the time unit can be "minutes", "hours" or "days", and the value a strictly positive integer. If this property is not filled, the default value is used (3 days, which corresponds to "days/3"). So, data older than RetentionPeriod will be deleted by TdbCleaner every time it is executed.
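
For example, to keep 30 days of data the property would contain "days/30". As a small sketch (again only my illustration, using the same Tango Java API calls as above), it can be set from code like this:

import fr.esrf.Tango.DevFailed;
import fr.esrf.TangoApi.ApiUtil;
import fr.esrf.TangoApi.DbDatum;

public class SetTdbRetentionPeriod {
    public static void main(String[] args) throws DevFailed {
        // "time unit/value", with the unit in minutes, hours or days.
        DbDatum retention = new DbDatum("RetentionPeriod");
        retention.insert("days/30");
        ApiUtil.get_db_obj().put_class_property("TdbArchiver", new DbDatum[] { retention });
    }
}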

Also, the TDB connection is OK, the attribute is fine and the data is archived in TDB, as indicated by the brown bulb.
But when I transfer to the VC, the attribute list appears but no data is updated (no graphs), whereas when I do the same thing for HDB, everything is perfectly fine.
This may be because your data was not exported to the database yet. TDB does not work exactly the same way as HDB. As TDB accepts a higher archiving rate than HDB, data are first written to files by the archivers (the deletion of these files must be done by the administrator, using for example a script registered in crontab), before being exported to the database. To make sure you can view your data as soon as possible in Mambo, you have to enable an option: in Mambo, go to "Tools/Options", then go to the "VCs" tab and select "yes" in "Force TDB export on View". Viewing your data will then take more time, as before data extraction Mambo will ask the TdbArchivers to export their data to TDB instead of letting them do it automatically at their own rate.
For HDB, data are written directly to the database.

Yes, I am using different Tango databases and one archiving database. I tried using comma-separated host entries in the TANGO_HOST variable. Only the first Tango database host in the TANGO_HOST list and its devices get listed in the Mambo AC; I am unable to get the other hosts and their devices.
Mambo was not designed to work with multiple Tango hosts at the same time. The comma separator is more likely interpreted as: if the first one doesn't answer, try the second one.


This is the error coming from the DataBrowser GUI. […]
Well, this is a bug. We will check it and try to find a correction.

Regards,

Raphaël Girardot
Hi Raphaël/Katy

Thanks for the reply to above queries.

As a matter of fact, there is a workaround. But first, you have to understand that HDB is expected to store data forever, which is why it was decided to put this kind of limit. DB administrators won't like to see their DB storage growing too fast, and users have to ask themselves whether it is really useful to store some data forever at a higher rate than every 10 s. That being said, the workaround is to use the "shortPeriodAttributes" property of the HdbArchiver class. This property must be written this way:
"Attribute complete name,minimum period in seconds".
For example:
"
tango/tangotest/1/short_scalar_ro,2
tango/tangotest/1/double_scalar_ro,1
"
So, for these attributes, you define the maximum archiving frequency, which can't be higher than 1 Hz (every second) (and no, there is no further workaround for this limit).

Thanks for the fix. It worked.

This may be because your data was not exported to the database yet. TDB does not work exactly the same way as HDB. As TDB accepts a higher archiving rate than HDB, data are first written to files by the archivers (the deletion of these files must be done by the administrator, using for example a script registered in crontab), before being exported to the database. To make sure you can view your data as soon as possible in Mambo, you have to enable an option: in Mambo, go to "Tools/Options", then go to the "VCs" tab and select "yes" in "Force TDB export on View". Viewing your data will then take more time, as before data extraction Mambo will ask the TdbArchivers to export their data to TDB instead of letting them do it automatically at their own rate.
For HDB, data are written directly to the database.

Selected "Yes" in "Force TDB export on view", still not able to view the data. Anyother workaround ??

Yes, you can, and in fact you MUST. Let me explain:
TDB cleaning is done by TdbCleaner, which is for now only available for Linux.
This cleaner should be registered in crontab, to be executed regularly (this is the part for which I wrote "you MUST").
To know which data is considered too old and must be cleaned, TdbCleaner reads the "RetentionPeriod" property of the TdbArchiver class. This property must be filled this way: "time unit/value", where the time unit can be "minutes", "hours" or "days", and the value a strictly positive integer. If this property is not filled, the default value is used (3 days, which corresponds to "days/3"). So, data older than RetentionPeriod will be deleted by TdbCleaner every time it is executed.

This is a very basic question, as I don't know anything about crontab. How do I register TdbCleaner in crontab?

Mambo was not designed to work with multiple Tango hosts at the same time. The comma separator is more likely interpreted as: if the first one doesn't answer, try the second one.

Well, this is a bug. We will check it and try to find a correction.

Ok got it. Thanks for the reply.

Here are a few more queries:

The following errors come up while using the AlarmTool GUI:

1. Rule creation error
Cannot set rule to database
Unknown column 'custom' in 'field list'

2. Rule creation error
Cannot set rule to database
Unknown column 'textTalkerEnabled' in 'field list'

3. ./AlarmManager 1

This is shown at the bottom of the console:
INFO Thread-12-f.s.a.a.a.i.s.i.LocalManager.registerNewArchivers:105 - Register Archiver archiving/alarmdb/alarmdbarchiver.01_01

Do I have to register the alarm archiver somewhere? And if yes, where?
4. In the DataBrowser application, is it possible to see the values of previously archived data?

Thanks and Best Regards
Balkrishna

Hello Balkrishna

I will answer you for the archiving part, and let Katy answer for the rest.

Selected "Yes" in "Force TDB export on view", still not able to view the data. Anyother workaround ??
This probably means your TDB archivers do not even write the files. So first, check where they write their files: take a look at their "DbPath" and "DsPath" properties. Maybe the folder path is not correct.

This is a very basic question, as I don't know anything about crontab. How do I register TdbCleaner in crontab?
First, take a look at these webpages:
- HowTo: Add Jobs To cron Under Linux or UNIX?
- How to schedule a task using Linux crontab (/etc/crontab) file
Here at SOLEIL, we decided to edit the /etc/crontab file for this kind of case.
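As a purely hypothetical example (the script path, user and schedule below are placeholders, not SOLEIL's actual setup), an /etc/crontab entry running TdbCleaner every night at 02:00 could look like:

0 2 * * *   tango   /usr/Local/Archiving/TdbCleaner/startTdbCleaner.sh >> /var/log/tdbcleaner.log 2>&1

The /etc/crontab format is: minute, hour, day of month, month, day of week, user, then the command to execute.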

Regards,

Raphaël Girardot
Hello Balkrishna,

Before I answer about DataBrowser and AlarmTool, and let Raphaël answer about the Archiving tools (Mambo), let me remind you of the global architecture, because your questions all come at the same level, while the items Alarm, AlarmDb, Archiving DB and DataBrowser are not all in the same project. By the way, it would be easier to answer if you posted your questions about DataBrowser and AlarmTool in separate topics…

* First, I have already answered Drea at the beginning of the topic about HDB, TDB and SNAP. Those 3 projects are included in the Archiving package that you have downloaded. Raphaël is in charge of this package.
The ADB that came with MACARENA has been abandoned and replaced by the AlarmTool project, which is still managed by me for the moment; I will hand the project over to Raphaël very soon. But we are very busy, so I have not yet had the time to transfer the project.

* Then AlarmTool: as I said before, it is still my responsibility. And as I said to Sudeep in the topic about it, you can download the project from this link. The future Archiving release will be fixed with my modifications that come from Sudeep's comments. So please try this package and read the doc folder to learn how to install AlarmTool.

* Finally, the DataBrowser project is a completely different tool that allows you to browse any data (Tango, NeXus, HDF5, Archiving). So yes, you can use it to read archived data from HDB or TDB.
For that, as I told you in the same topic here, you have to configure the database connection through the databrowser script delivered in the package.

* I have a question in return: you get an exception when you try to open a Tango attribute (wave or something else).
Could you explain to me what you are trying to open, so that I can reproduce the error on my computer?
I will try to fix your problem. It would help me if you sent me a screenshot. And could you post your problem in a separate topic about DataBrowser?

Thank you very much. I hope it will help you.

Katy






Hi Katy/Raphaël

This probably means your TDB archivers do not even write the files. So first, check where they write their files: take a look at their "DbPath" and "DsPath" properties. Maybe the folder path is not correct.

DbPath is properly defined. It is still not working.

First, take a look at these webpages:
- HowTo: Add Jobs To cron Under Linux or UNIX?
- How to schedule a task using Linux crontab (/etc/crontab) file
Here at SOLEIL, we decided to edit the /etc/crontab file for this kind of case.

Thank you for the inputs.

Then AlarmTool: as I said before, it is still my responsibility. And as I said to Sudeep in the topic about it, you can download the project from this link. The future Archiving release will be fixed with my modifications that come from Sudeep's comments. So please try this package and read the doc folder to learn how to install AlarmTool.

Thank you for the inputs. AlarmTool is running perfectly fine. I just want to add one fix in AlarmtoolGUISoleil.sh: replace

java $JAVA_OPTIONS -cp $CLASSPATH -Duser.language=US -Dmanager_mode=device -Djava.ext.dirs=$CLASSPATH fr.soleil.archiving.alarm.gui.AlarmTools -workdir $ALARMTOOL_ROOT\configuration\workmysql
with
java $JAVA_OPTIONS -cp $CLASSPATH -Duser.language=US -Dmanager_mode=device -Djava.ext.dirs=$CLASSPATH fr.soleil.archiving.alarm.gui.AlarmTools -workdir $ALARMTOOL_ROOT/configuration/workmysql
(i.e. the Windows-style backslashes in the -workdir path must become forward slashes).

I have a question in return: you get an exception when you try to open a Tango attribute (wave or something else).
Could you explain to me what you are trying to open, so that I can reproduce the error on my computer?
I will try to fix your problem. It would help me if you sent me a screenshot. And could you post your problem in a separate topic about DataBrowser?

I was not able to set the rule in the alarm database initially. Now I am able to do so with the latest version you provided.
Also, I will create separate threads for the discussion of DataBrowser and Alarm if I come across any doubts.
Thanks and Best Regards
Balkrishna
Hi Raphaël

I have one more query

As a matter of fact, there is a workaround. But first, you have to understand that HDB is expected to store data forever, which is why it was decided to put this kind of limit. DB administrators won't like to see their DB storage growing too fast, and users have to ask themselves whether it is really useful to store some data forever at a higher rate than every 10 s. That being said, the workaround is to use the "shortPeriodAttributes" property of the HdbArchiver class. This property must be written this way:
"Attribute complete name,minimum period in seconds".
For example:
"
tango/tangotest/1/short_scalar_ro,2
tango/tangotest/1/double_scalar_ro,1
"
So, for these attributes, you define the maximum archiving frequency, which can't be higher than 1 Hz (every second) (and no, there is no further workaround for this limit).

I tried this method and it is running perfectly fine. Just one more request: I have around 100 attributes to be archived at 1 s in HDB.
Do I have to type all the attributes in the HdbArchiver class property? I mean like this, for example:
1."tango/tangotest/1/short_scalar_ro,1
2. tango/tangotest/1/double_scalar_ro,1
….
….
….
100.tango/tangotest/1/boolean,1"

Isn't there any shortcut method to do so? Just asking out of curiosity, whether it can be done.

Thanks and Best Regards

Balkrishna
 