SDRTrunk Trunking Recorder help needed for bulk export of 500,000+ calls into Excel

Status
Not open for further replies.

lockersdr

Member
Joined
Nov 15, 2020
Messages
8
Location
Detroit, MI
Currently I have been going page by page and downloading 500 calls at a time, which is getting monotonous to say the least. I average about 100,000 calls per day and would like to find an effective way to pull that data directly into Excel without so much manual labor (it takes about 20 minutes per day to export the last 24 hours of call data).

Any ideas on the best way to go about capturing this data?

Thank you!
 

lwvmobile

DSD-FME
Joined
Apr 26, 2020
Messages
1,296
Location
Lafayette County, FL
Well, can you describe your current method of importing the log files into Excel? Also, if you have both an example spreadsheet that you are using and an example log file you are importing from, it might make things easier for anybody who might be able to help you out. Perhaps you could look into writing an Excel macro to grab those log files and automate the process. YouTube might be a good place to find out more on that subject.
 

lockersdr

Member
Joined
Nov 15, 2020
Messages
8
Location
Detroit, MI
My current method is clicking "Export to Excel" on the web server. The log file is an SQLite database, which I am unversed in. This is for Trunking Recorder. I attempted to connect to the SQLite database via Power Query, but I was unable to get it to work correctly.
 

jonwienke

More Info Coming Soon!
Joined
Jul 18, 2014
Messages
13,409
Location
VA
Excel isn't going to handle sheets with 500,000 rows. It chokes at about 65,000 rows. You're going to need a real database for that many records.
 

lockersdr

Member
Joined
Nov 15, 2020
Messages
8
Location
Detroit, MI
Excel isn't going to handle sheets with 500,000 rows. It chokes at about 65,000 rows. You're going to need a real database for that many records.

That is not a concern of mine; I work with datasets equally large, if not larger, without issue every day. I just need an easier way to capture the data than my current method, if anyone has in-depth knowledge of Trunking Recorder.

Thanks!
 

jonwienke

More Info Coming Soon!
Joined
Jul 18, 2014
Messages
13,409
Location
VA
If you try to paste 500,000 rows into Excel, it will become a concern, because Excel is limited to 65K rows in a sheet. And that's most likely why you're having problems. Learn SQLite, or do a linked table from Access (there's an ODBC driver you can download to do that) and you can query and process the source data in Access without doing any import/copy/paste crap.
 

a417

Active Member
Joined
Mar 14, 2004
Messages
4,669
Excel & the OSS alternatives (LibreOffice and such) all handle 1,048,576 rows by 16,384 columns in their sheets; however, they become lethargic & awkward when moving massive amounts of data in & out.

Agreed that "Export to Excel" is a frustrating way to do it, but IIRC SQLite (in other implementations) can output comma-delimited text files, and Excel can import them... maybe that's an easier way? Does Trunking Recorder allow you to dump a comma-delimited text file?
 

jonwienke

More Info Coming Soon!
Joined
Jul 18, 2014
Messages
13,409
Location
VA
Creating a linked table in Access connecting to the original SQLite data is a much more elegant solution than any export. Access will see the live SQLite data, and can query and process it in ways that would be very cumbersome to do in Excel, if possible at all. Using Excel as a database is always a kludge, and one of the worst possible options if you have more than a few thousand rows of data. Trying to use Excel for 500K+ rows is insanity.
 

a417

Active Member
Joined
Mar 14, 2004
Messages
4,669
Using Excel as a database is always a kludge, and one of the worst possible options if you have more than a few thousand rows of data. Trying to use Excel for 500K+ rows is insanity.
I'm not disputing that at all, Jon... but if @lockersdr needs to continue using Excel for his own reasons, alternative import methods may work.

I agree that once you cross about 1,000 records, Excel is a terrible choice for any kind of DB work. I'm partial to MariaDB myself, due to its server-side scripting & XML support, which lets one DB server instance handle a wide variety of local services I use daily.
 

lwvmobile

DSD-FME
Joined
Apr 26, 2020
Messages
1,296
Location
Lafayette County, FL
A little Python (or any language) programming might do the trick if you really REALLY want to import an SQLite database into Excel. I'm not really versed in it, since I don't have any data to test against, but there seems to be a plethora of information available through Google searching, and even example Python scripts that you can tailor to your needs. You may not even need something like Python; you might issue a command directly to sqlite3 and have it dump the database in CSV format. I'm not sure how Trunking Recorder's implementation of SQLite works and whether you can talk to it directly.

Code:
https://gist.github.com/****almule04/82d2091e2f43cb63029500b56ab7a8cc

https://www.reddit.com/r/Python/comments/2tcopo/how_to_export_sqlite3_db_files_to_excel/

https://stackoverflow.com/questions/18678243/how-to-import-data-from-sqlite-to-excel-using-xlsxwriter-in-python
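Something along these lines might work as a starting point. This is purely a sketch: the table name (`Calls`) and the database path in the example are guesses that would need checking against the actual Trunking Recorder database (e.g. with `.tables` in the sqlite3 shell). It only uses Python's built-in `sqlite3` and `csv` modules, so no extra libraries are needed:

```python
import csv
import sqlite3

def dump_table_to_csv(db_path, table, out_path):
    """Dump every row of `table` from the SQLite database at `db_path` to a CSV file."""
    conn = sqlite3.connect(db_path)
    try:
        # Note: the table name is interpolated directly, so only pass trusted names.
        cur = conn.execute(f"SELECT * FROM {table}")
        headers = [col[0] for col in cur.description]
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(headers)   # column names as the header row
            writer.writerows(cur)      # stream all rows out of the cursor
    finally:
        conn.close()

# Hypothetical usage -- adjust the path and table name to your install:
# dump_table_to_csv(r"C:\Program Files\Trunking Recorder\TrunkingRecorder.db",
#                   "Calls", "calls.csv")
```

Excel (or Power Query) can then open the resulting .csv directly, which sidesteps the 500-at-a-time export entirely.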
 

lwvmobile

DSD-FME
Joined
Apr 26, 2020
Messages
1,296
Location
Lafayette County, FL
Well, I might have crafted a little solution. I say the word MIGHT very loosely, because I don't have any log data to test it against, let alone 500,000 lines of it. I'm not even sure whether the change I made in the JS code is enough by itself, or whether more needs to be done.

tr-more.jpg

Close out of Trunking Recorder and Browse to the folder:
C:\Program Files\Trunking Recorder\Website\js

Right click on TrunkingRecorder.js and click properties, click security tab, highlight Users in the upper box, and click the checkmark box for Modify in the lower box, then click okay.

tr-more2.jpg

tr-more-3.jpg

Now right-click on TrunkingRecorder.js again and click Edit, scroll down to find this line, and add more values like I have; the original copy should only go up to 500. Then save, restart Trunking Recorder, go to its webpage, and see if it works. I honestly don't know whether this will work, and it may well bring the browser or computer to its knees trying to parse that much data. If it's no good, just scale back the numbers or revert to the original maximum of 500.

tr-more-4.jpg
 


lockersdr

Member
Joined
Nov 15, 2020
Messages
8
Location
Detroit, MI
Excel & the OSS examples (Libreoffice and such) all handle 1,048,576 @ 16384 rows in their sheets, however they become lethargic & awkward when trying to move massive amounts of data in & out.

Agreed that "Export to Excel" is a frustrating way to do it, but IIRC (other implementations of ) SQLite outputs comma delimited text files, and Excel can import them...maybe that's an easier way? Does Trunking Recorder allow you to dump a comma delimited text file?

Thank you for leading me down this path. I ended up exporting the tables I needed to .csv using DB Browser for SQLite and merging them together with a Power Query with zero issues. My problem was accessing the data, as I have limited SQL knowledge (not something necessary for what I do), so again, I appreciate you helping me connect the dots here. The first query took about 30 seconds; now, for each file I add, it takes about 10 seconds to refresh roughly 50,000 line items across the 3 .csv tables I have loaded.
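For anyone who would rather script the merge step instead of using Power Query, here is a minimal sketch. It assumes all the exported .csv files share the same header row (the pattern and file names are just placeholders):

```python
import csv
import glob

def merge_csv_files(pattern, out_path):
    """Concatenate all CSV files matching `pattern`, keeping only the first header row."""
    with open(out_path, "w", newline="") as out:
        writer = None
        for path in sorted(glob.glob(pattern)):
            with open(path, newline="") as f:
                reader = csv.reader(f)
                header = next(reader)
                if writer is None:
                    writer = csv.writer(out)
                    writer.writerow(header)   # write the header exactly once
                writer.writerows(reader)      # append data rows from every file

# Hypothetical usage:
# merge_csv_files("exports/*.csv", "all_calls.merged")
```

Note the output file should not itself match the glob pattern, or it would be picked up as an input on re-runs.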
 


lockersdr

Member
Joined
Nov 15, 2020
Messages
8
Location
Detroit, MI
Well, I might have crafted a little solution. I say the word MIGHT very loosely, because I don't have any log data to test it against, let alone 500,000 lines of it. I'm not even sure whether the change I made in the JS code is enough by itself, or whether more needs to be done.

View attachment 96923

Close out of Trunking Recorder and Browse to the folder:
C:\Program Files\Trunking Recorder\Website\js

Right click on TrunkingRecorder.js and click properties, click security tab, highlight Users in the upper box, and click the checkmark box for Modify in the lower box, then click okay.

View attachment 96930

View attachment 96932

Now right-click on TrunkingRecorder.js again and click Edit, scroll down to find this line, and add more values like I have; the original copy should only go up to 500. Then save, restart Trunking Recorder, go to its webpage, and see if it works. I honestly don't know whether this will work, and it may well bring the browser or computer to its knees trying to parse that much data. If it's no good, just scale back the numbers or revert to the original maximum of 500.

View attachment 96933

This is an interesting test; I will give it a shot in a minute and let you know how it turns out. I just posted a solution about 30 seconds ago, and my only remaining issue is that I am unable to decipher the date/time values to format them into something I can use. This may be a more straightforward approach. Thank god for 32 GB of memory.
 

jonwienke

More Info Coming Soon!
Joined
Jul 18, 2014
Messages
13,409
Location
VA
I mention Access because if you have Excel, you probably already have Access.
 

a417

Active Member
Joined
Mar 14, 2004
Messages
4,669
I appreciate helping me connect the dots here.
No problem. I've been using .csv as a go-between since (gawd) the Borland dBASE days, whenever more 'appropriate' methods are unreliable or unsuitable. When in doubt, or in a rush, comma-delimited for the win!

It's not the best, but it can work.
 

BDavis27707

Member
Joined
Jan 3, 2021
Messages
48
Location
Durham, NC
I may have missed this in the thread so far, but you can use Excel pivot tables to view and filter your data. I have used them with files containing over 1 million rows.
 

a417

Active Member
Joined
Mar 14, 2004
Messages
4,669
I may have missed this in the thread so far, but you can use Excel pivot tables to view and filter your data. I have used them with files containing over 1 million rows.
Pivot tables are great for analyzing large data sets, but his issue was getting the data into Excel.
 

scannerbox

Member
Joined
Jan 30, 2010
Messages
94
Location
michigan
The Trunking Recorder server enforces the 500-row call limit even if you change the JavaScript, so it will still only return 500 rows.
To export 100,000+ calls today, querying the database is probably the best method. I would advise caution when working with the database directly, since there is a chance of causing issues and, worst case, data corruption/loss if done incorrectly.

The date and time format used in the database is a Unix epoch timestamp in UTC (the number of seconds that have elapsed since January 1, 1970). I'm not sure whether Excel can format that into a standard date/time. See Epoch Converter for more info on epoch time.
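For what it's worth, Excel can do the conversion with a formula: Excel stores dates as serial day counts, and January 1, 1970 is serial 25569, so `=(A1/86400)+25569` in a date-formatted cell turns an epoch value into a UTC date/time (add an hours/24 offset if local time is wanted). As a quick sanity check, the same conversion in Python:

```python
from datetime import datetime, timezone

def epoch_to_utc(ts):
    """Convert a Unix epoch timestamp (seconds) to a 'YYYY-MM-DD HH:MM:SS' UTC string."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")

print(epoch_to_utc(0))           # 1970-01-01 00:00:00
print(epoch_to_utc(1609459200))  # 2021-01-01 00:00:00
```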

If you can share a little more information about what you are trying to do with the data there might be some Trunking Recorder enhancements that could be explored in the future to make it easier or produce the desired information.
 

jonwienke

More Info Coming Soon!
Joined
Jul 18, 2014
Messages
13,409
Location
VA
A linked table connection in Access can be set to read-only, and it doesn't write to the data table(s) unless you run an update query, or open the linked table in read/write mode and intentionally edit a record.
 