Hemo2

Members · Posts: 64
Everything posted by Hemo2

  1. I was working on a PC today that had a SQL data source defined. I looked in the TNI record for this computer and don't think it logs DSN info, so I thought I'd add a request to see if it's possible to gather DSN info from the registry in an update. I believe the list of DSN names on a computer is stored in the "HKEY_LOCAL_MACHINE\SOFTWARE\ODBC\ODBC.INI\ODBC Data Sources" key. The definition/info for each individual DSN is then located in a registry key under the "HKEY_LOCAL_MACHINE\SOFTWARE\ODBC\ODBC.INI\" branch, where the name of the key matches the DSN name retrieved from the "ODBC Data Sources" entries. (i.e. if you have a data source named "MyData", then inside the "HKEY_LOCAL_MACHINE\SOFTWARE\ODBC\ODBC.INI\ODBC Data Sources" key you will find a REG_SZ entry called "MyData", and the definition for the MyData DSN has its info stored in the "HKEY_LOCAL_MACHINE\SOFTWARE\ODBC\ODBC.INI\MyData" key.) If it's not convenient to retrieve the actual definition info for each DSN, it would still be helpful to retrieve just the list of DSN names from the "ODBC Data Sources" key. That way, if you need to rebuild a computer, you'd at least know it had a DSN defined and could ask someone for the actual definition info so you could recreate that DSN.
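To illustrate, here's a rough Python sketch of reading those keys (the registry paths are the ones above; the function names and structure are just my own illustration, not anything TNI actually does):

```python
# Sketch: enumerate system DSNs from the registry paths described above.
# winreg is in the standard library but only exists on Windows, so the
# pure path-building logic is separated out for illustration.

ODBC_KEY = r"SOFTWARE\ODBC\ODBC.INI"

def dsn_definition_paths(dsn_names):
    """Map each DSN name to the registry key holding its definition."""
    return {name: ODBC_KEY + "\\" + name for name in dsn_names}

def read_dsn_names():
    """Read the DSN name list from HKLM (Windows only)."""
    import winreg
    names = []
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                        ODBC_KEY + r"\ODBC Data Sources") as key:
        i = 0
        while True:
            try:
                name, _driver, _type = winreg.EnumValue(key, i)
            except OSError:  # no more values under this key
                break
            names.append(name)
            i += 1
    return names

# A DSN named "MyData" is defined under ...\ODBC.INI\MyData
print(dsn_definition_paths(["MyData"]))
```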
  2. Sorry, but I have to revive the issue of duplicate records for the same computer caused by the file names being stored with the MAC address. We get duplicate records when we replace a motherboard (or NIC), and when we buy a new computer to replace an existing one and give it the same name as the old one. Now we are also getting them on laptops: a user will be physically plugged into the network via the ethernet jack while sitting at their desk, then take the laptop to a meeting and log in wirelessly. The wireless card, of course, has a different MAC address, which creates a duplicate inventory file for the same computer.
  3. Sorry for the delayed response. Thank you for adding this! I haven't done a thorough check yet, but initially I thought it wasn't working: I checked an existing asset and it still showed as a desktop. So I rescanned it, but it still showed as a desktop. So I deleted all previous snapshots, but it still showed as a desktop. I then "reloaded" my TNI storage, but that didn't help either. I had to delete the record in TNI and scan the machine in as new, and now it shows as a laptop. I'll do some more checking, but will existing laptops that are showing as desktops need to be deleted and rescanned? I see you can set the asset properties manually, but with 7,199 assets currently in my list, that would be a bit of a chore to go through and set these.
  4. A great big THANK YOU for the "/overwrite" switch in the latest update! :)
  5. I wanted to post a quick followup on another reason why we feel an optional "/overwrite" command line switch for the tniwinagent.exe program would be valuable. I have login script scanning set up, and it stores the resulting inventory files on a shared server. I have now spent the better part of 2 days simply 'importing' these files. The import process with so many thousands of files is terribly slow. I assume this is because the import process opens each and every file across the network and has to look at the 'scan date' of the inventory file to determine whether it needs to be imported. Since a new inventory file is created each time a user logs into their computer, there are many files per computer that the import process needs to deal with. If we could have just a single file for each computer, I suspect the import would be much faster. We could periodically delete all the inventory files on the shared server to clean it up, but that would be a never-ending task, and you could never do it efficiently enough to stay ahead of the numerous logins thousands of users perform each day. Plus, we have many different techs and managers with access to these files, and we don't all import them at the exact same time, so we need to leave the inventory files on this server permanently to ensure they're always available for anyone to import. I understand why having separate inventory files created for each login can be useful, but for our situation, a single file overwritten with each login scan fits our needs much better.
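In the meantime, here's a rough Python sketch of the kind of stopgap cleanup we've been considering (the " (n)" filename suffix is my assumption about how repeat scans are named; this is our workaround idea, not a TNI feature):

```python
# Sketch: find the newest inventory file per computer so older duplicates
# can be pruned from the shared folder. Assumes repeat scans are written
# as "NAME.xml", "NAME (2).xml", "NAME (3).xml", etc. (an assumption).
import os
import re

def newest_per_computer(folder):
    """Return {base_name: path_of_newest_file}; older files can be deleted."""
    newest = {}
    for fname in os.listdir(folder):
        if not fname.lower().endswith(".xml"):
            continue
        # Strip any " (n)" suffix so all scans of one computer group together.
        base = re.sub(r"\s*\(\d+\)\.xml$", ".xml", fname, flags=re.IGNORECASE)
        path = os.path.join(folder, fname)
        mtime = os.path.getmtime(path)
        if base not in newest or mtime > newest[base][0]:
            newest[base] = (mtime, path)
    return {b: p for b, (m, p) in newest.items()}
```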
  6. Hi Benny. I thought they changed to using the MAC because of duplicate computer names "across domains"? (i.e. if you have 2 separate domains but by chance have a computer with the same name in both, you get two completely different computers that genuinely exist with the same name.) In TNI v1.6.8 you could only store one of these records, since the file name was simply the 'computer name', so only one of them could exist in TNI. A naming scheme of "computer_name.your.domain.com" would solve both that problem and my problem. In v1.6.8, when we changed out a motherboard we did not get a second, duplicate record, since the actual file name stayed the same even if the MAC address changed. For the purposes of 'asset' management, the current naming scheme using the MAC address does not work at all, and all the data in your reports is flawed, since you end up with 'old' and 'new' (duplicate) records even though only 1 physical computer exists. Basically, our scenario is this: we have a computer named HRLA1111. The resulting file name inside TNI v2 is HRLA1111_1234567890 (or whatever the MAC address is). We now have this asset inventory record as part of TNI v2. Then, 5 months down the road, the motherboard dies and we replace it with a new one, which changes the NIC/MAC address. We end up with a "new" asset file inside TNI v2 of HRLA1111_ABCDEFGH (or whatever the MAC address is). But keep in mind, the previous record (file) still exists in TNI. So we now have 2 physical records in TNI that show they exist in a particular office, but only 1 actual computer. When you run a report, the data is completely flawed, since it doesn't accurately reflect what you actually have: it will say we have 2 computers and 2 copies of Microsoft Office, etc., when in fact there is only 1. A similar scenario is when you buy a new computer to replace an existing one but 'shuffle down' the existing one to another user.
The actual computer name remains the same for a particular user, even though the hardware is different. (Our computer names are set up to reflect the office, city, and phone number of the user, to make it easy for our techs to quickly identify any computer.) So the new computer comes in and gets the existing name of an old computer. That old computer gets shuffled to a different user, taking the name that person's computer had. Now we have 4 files in TNI, but only 2 physical computers actually exist. As it is, we have over 10,300 computers to inventory, and we are constantly replacing, repairing, shuffling, etc., so the naming scheme in TNI v2 is a nightmare for us: we will very quickly end up with 'many' duplicate entries in TNI. (We already have some.) There is no way, with so many computers spread across an entire state, that we can manually track every hardware change so we know to delete the "old" record in TNI v2 after a simple change like replacing a NIC. Plus, we have the Nationwide License, so TNI is installed on multiple nodes and you have to try to manage duplicate records on many different computers. (Basically, it's not a workable situation.)
  7. I have been getting our techs converted over to TNI v2 and immediately received the same few comments. I have already mentioned these via private messages, but I told our folks I'd post them again. - I edited all our login scripts and set up login script scanning with the TNIWINAGENT.EXE program. We already have some users who have logged in many times, creating a new asset file for each login with 'parentheses' as part of the filename. (Some already have 19 asset files in a few days.) With over 10,000 computers, you do the math, and you can see our concern about the number of files being written to the server that stores the login script scanning files. I have already received questions on how we can limit this to a single inventory file for each computer stored on our server. Please add an "/overwrite" switch (or some equivalent functionality) to the TNIWINAGENT.EXE program so we have the option to store only a single file for each computer, regardless of how many times a user has logged in and run the login script scan. - What are all the command line switches for the tniwinagent.exe program? (It appears that section of the online documentation is under construction.) - I know this is already being considered, so I apologize for posting again, but I told them I'd post this concern. I've received questions on why we are getting duplicate records after a motherboard or computer was replaced or shuffled down, etc. (This is due to the file names being stored with the 'MAC' address as part of the file name.) We use strict naming on our computers, and when a computer gets replaced or shuffled down, or a replacement motherboard is put in, we still use the same computer name. But the NIC was obviously changed, which means the MAC address is different, now creating 'different' asset files for that user.
Again, with over 10,000 computers that continuously get replaced, moved, shuffled, repaired, etc., we need some way to NOT store the MAC address as part of the file name, and instead store files the way they worked in v1.6.8, where it was simply the "computer name". (Or store them as COMPUTER_NAME.your.domain.com.) That way, even if the NIC gets changed, the inventory file for that particular user still exists only once, instead of ending up with separate hardware inventory files for the same user, which causes problems with reporting and tracking which computers and what software we have installed in offices.
  8. I'm sure many folks are happy to see this! Thank you for all the hard work! I'm going through some things now. I know I had reported some of these before, but the below computers are being reported as 'desktops' when in fact they should be reported as 'laptops'.
HP EliteBook 2530p laptop - CPU: Intel Core2 Duo CPU L9400 @ 1.86GHz
HP EliteBook 2540p laptop - CPU: Intel Core i7 CPU L 640 @ 2.13GHz
HP EliteBook 2730p tablet - CPU: Intel Core2 Duo CPU L9400 @ 1.86GHz
HP EliteBook 2730p tablet - CPU: Intel Core2 Duo CPU L9300 @ 1.60GHz
HP EliteBook 8540p laptop - CPU: Intel Core i7 CPU Q 820 @ 1.73GHz
HP EliteBook 8560p laptop - CPU: Intel Core i5-2540M CPU @ 2.60GHz
HP EliteBook 8460p laptop - CPU: Intel Core i5-2520M CPU @ 2.50GHz
HP EliteBook 2560p laptop - CPU: Intel Core i7-2620M CPU @ 2.70GHz
HP EliteBook 2760p tablet - CPU: Intel Core i5-2520M CPU @ 2.50GHz
  9. I had sent an 'enhancement' request in via the TNI v2 beta for this, but I'm wondering how others might handle this issue. (I also can't believe I only just now thought of this! Too busy doing limited testing to run across the problem, I guess, but regardless, this issue is staring me full in the face now!) With TNI v2, the asset file names are in the format "computer name_mac address". Previously, in v1.6.8, it was simply "computer name". We have thousands of computers, and they are constantly replaced with new ones, existing ones get shuffled down to other users, or motherboards get replaced to fix hardware problems, etc. The computer name for each user never changes, though. Thus, if a user gets a new computer to replace an old one, the new computer will be given the same name as the old one. (The old one is either surplused or shuffled down to someone else and in turn gets renamed for that other user.) We have a strict naming scheme for our computers to identify the town, office, and phone number of the user, making it easy for all our techs to identify this info at a glance just by looking at the computer name. This creates a pretty big issue with duplicate, and incorrect, data for literally hundreds, and eventually thousands, of asset records in TNI v2. When a new computer gets put in with the old name, it obviously has a "different" MAC address, so the name of the asset file reflects this, but the old files for the previous computers still exist. We now have duplicate records for the same user, which gives us incorrect data in every report. (Any time the NIC gets changed, the resulting file name will be different, creating duplicate and incorrect data.) Am I missing something obvious on how to reconcile this problem? How are others out there dealing with it? Is there a way to "combine" these two different file names into a single asset record? Or is there a way to have TNI use the file naming scheme from v1.6.8?
If the files were named just with the 'computer name', this problem never occurred, but with the new naming scheme in v2, we think our data is going to be significantly flawed. Or maybe the file names could be stored as "computer name_FQDN", such as "computer.my.domain.com"? Since our domain name is the same, this wouldn't create a new file just because of a hardware change. We have too many computers to even begin to 'manually' clean out the thousands of old duplicate records we are going to end up with due to the naming of the files. There are simply too many new computers getting installed, too many shuffle-downs, and too many motherboards/NICs getting replaced, so I really hope there is some way to deal with this in v2, as this is a potential game breaker for us!
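For what it's worth, here's a quick Python sketch of how we've been thinking about at least flagging these duplicates by grouping on the part of the file name before the last underscore (assuming the "computername_macaddress" format described above; this is just a workaround sketch, not a TNI feature):

```python
# Sketch: group v2-style inventory file names ("NAME_MAC.xml") by the
# computer-name portion and report any computer with more than one file.
from collections import defaultdict

def find_duplicates(filenames):
    """Group inventory files by computer name; return names with >1 file."""
    groups = defaultdict(list)
    for fname in filenames:
        stem = fname[:-4] if fname.lower().endswith(".xml") else fname
        computer, _, _mac = stem.rpartition("_")
        groups[computer or stem].append(fname)
    return {name: files for name, files in groups.items() if len(files) > 1}

print(find_duplicates(["HRLA1111_1234567890.xml", "HRLA1111_ABCDEF0123.xml",
                       "HRLA2222_0011223344.xml"]))
# HRLA1111 has two files (old and new MAC), so it gets flagged
```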
  10. I noticed with the previous betas, and now the latest v2.0.0.975 (Mar 17, 2011), that the below computers are incorrectly logged as "Windows desktop" computers when in fact they should be listed as "Windows laptops". I believe the CPU info is used to determine whether a device is logged as a desktop or laptop, so I have included that information below.
HP EliteBook 2530p laptop - CPU: Intel Core2 Duo CPU L9400 @ 1.86GHz
HP EliteBook 2540p laptop - CPU: Intel Core i7 CPU L 640 @ 2.13GHz
HP EliteBook 2730p tablet - CPU: Intel Core2 Duo CPU L9400 @ 1.86GHz
HP EliteBook 2730p tablet - CPU: Intel Core2 Duo CPU L9300 @ 1.60GHz
HP EliteBook 8540p laptop - CPU: Intel Core i7 CPU Q 820 @ 1.73GHz
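As a guess at what's going on (I don't know TNI's actual detection logic), the mobile Intel parts in this list all carry an "L"/"Q" prefix or an "M" suffix in the model string, so a broader model-string heuristic might catch them. A rough Python sketch, purely illustrative:

```python
# Illustrative heuristic only: patterns that mark the mobile Intel parts
# appearing in the list above. Not TNI's actual classification logic.
import re

MOBILE_PATTERNS = [
    r"\bL\s?\d{3,4}\b",   # low-voltage mobile: Core2 Duo L9400, Core i7 L 640
    r"\bQ\s?\d{3}\b",     # mobile quad: Core i7 Q 820
    r"-\d{3,4}M\b",       # mobile "M" suffix: i5-2540M, i7-2620M
]

def looks_mobile(cpu_name):
    """Guess whether a CPU model string looks like a mobile part."""
    return any(re.search(p, cpu_name) for p in MOBILE_PATTERNS)

print(looks_mobile("Intel Core i5-2540M CPU @ 2.60GHz"))    # a mobile part
print(looks_mobile("Intel Core2 Duo CPU L9400 @ 1.86GHz"))  # a mobile part
```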
  11. Thank you, Zak. Sorry to keep hounding on this. (Most folks in my organization stopped believing me when I kept saying a new version was coming!) I would be interested in checking out the beta.
  12. I have to ask, is a new version going to be released?
  13. We had a situation where, somehow, the files in the network location where our login scripts store the resulting data files all had their filenames changed from "name.xml" to "name.xml.compressed". The files are definitely compressed, too, as they were around 1,204KB in size and are now less than 90KB. The result is that when we try to "refresh" our audit tool folder, it doesn't find any data files to refresh, since there aren't any in .xml format anymore. I suspect someone inside the program set their "Storage" folder location to the network folder instead of a 'local' folder on their hard drive. Then, with the Storage option "Compress previous folder on changing of folder" checked, this person changed the location to a new folder, and the previous folder, which is actually our network location, got compressed. A couple of questions: - Is there a way to "change back", or convert, these now-'.compressed' files back into normal .xml files? I assume there is and I'm just overlooking it. - Could perhaps the new version of the program not have the compress option set as the default? I'm just thinking ahead here: if this is what caused our issue, it might be nice not to have file compression preset as the default option in TNI.
  14. Daron, we set our delay to 5 minutes to ensure the various processes and services have finished loading on our computers before tniaudit runs. (Some of the computers are older and slower.) One thing we noticed is that if the user logs in but then reboots or logs off before the delay finishes and tniaudit runs, they will get a message that a process (tniaudit) is still running and be prompted to terminate it, instead of simply logging off like you'd expect. You may want to test whether it works the same way for you. It would be nice if this didn't happen and tniaudit would just close without incident. It's not a huge deal, but it is something to inform your users about if you find this occurs for you.
  15. Thank you for the follow-up reply, Zak. We definitely would prefer you work on the next version rather than look into issues in the current one! I've fielded a lot of complaints and comments on the performance and this error, along with questions about when the next version will be available. I'll keep telling them I'm not sure when the next version might be released and to be patient. Thanks.
  16. I know it hasn't quite been a few months since I reported this, but the access violations are becoming quite frustrating. I've had multiple techs and supervisory staff complain about them as we continue to encounter them. Unfortunately, the events.log file isn't getting updated with any error info, or any other info for that matter, when this happens. Today I encountered the error while refreshing the 'audit tool' folder, which is a storage folder on a local server. I re-ran the audit refresh, and the second time it was able to complete without a problem. However, I then tried to pull up a custom report on our computers, as we need to identify what version of antivirus is installed everywhere for an ongoing project. The first time, I got the access violation within 10 minutes of starting the report. I restarted, and after 1 hour and 45 minutes of the report scanning through the data records, it bombed with another access violation, and I needed to start over again. So after the better part of an entire work day trying to update info and get a report, I still haven't completed the task. Sorry, I just had to vent, and again pass on the request from all our techs and admin staff for a timely release of the next version. I don't know that the next version is the magic fix for this, but our folks are pretty frustrated at this point. We're hoping that, if nothing else, the significantly smaller data file sizes in the next version will alleviate our field staff's bandwidth concerns so we can do login script scanning statewide for remote sites. (And that the smaller files help speed up the reporting process too!)
  17. I have TNI v1.6.8 installed on my Windows 7 Professional 32-bit workstation, which has 4GB of RAM. Periodically, when I run 'Refresh audit tool folder', at some point during the refresh as it's reading files I will get the below error. (I can't say the numbers in the error are the same every time, since I haven't previously written them down.) "Error: Access violation at 0x00432F6A (tried to write to 0x00030FFC), program terminated." After I click OK on the error, TNI terminates and I have to start all over. As I said, it doesn't happen every time, but often enough to be an annoyance. Other techs have reported this to me also. In some cases, I've just had the techs manually copy the files from the audit tool folder down to their local data folder and then refresh the data storage folder. We do have over 6,800 data files stored in our network audit tool folder. I checked file permissions, and I have "full" control over the folder TNI is installed in, the local data folder, and the audit tool folder. I checked the "events.log" file, but it isn't getting updated with any errors or messages and is in fact a few weeks old. Is this something other folks have run into? What might be causing it? Do we perhaps have too many files in our audit folder? Or maybe the number of files and the amount of data it has to read/copy over is causing the issue? Which leads me to my next question regarding the next version of TNI. The next version is supposed to have greatly reduced data file sizes. I've had multiple comments about how slow TNI is to do a refresh, and our field sites also refuse to do login script scans to a centralized location due to bandwidth concerns, so I'm wondering if the new version of TNI will be arriving soon?
  18. That's what I suspected. Thanks for the followup.
  19. I had one of our support teams ask if you can include the NIC driver "version" in a report. (We've had some issues with a few cards with certain versions of drivers.) I looked through and I don't think this is an item that TNI collects, but I wanted to ask to make sure. Thanks.
  20. Thanks for the reply. I think the confusion on SEP was on our part. We had someone create a custom report with the 'antivirus version' field as the only field for AV stuff. With Vista/Win7, Microsoft apparently doesn't report back this info to the security center anymore, thus it appeared that TNI was showing nothing. Adding the 'antivirus name' and 'antivirus status' does show info for SEP however. It would be nice if the AV version could somehow be included in this report though, but I think I understand why it isn't and where the confusion on our part came.
  21. Zak, we have begun our migration from XP to Windows 7. (We have over 8,000 workstations.) As part of this, we're upgrading our antivirus product from Symantec AntiVirus to Symantec Endpoint Protection. I know the current version of TNI doesn't report Symantec Endpoint Protection correctly under Windows 7. This topic was brought up to me, and since it appears the next version of TNI is a way off from being released, I thought I'd ask if there's any possibility the current version can be updated to properly report the status of SEP under Windows 7. Also, I had a question about the "Windows 7 Ready" item in the custom report and what hardware minimums the program uses to base the "Yes" or "No" answer on. Can you tell me what hardware requirements TNI uses for this? And is there any way we can specify our own requirements for TNI to base its answer on? As a side note, we're upgrading from Office XP to Office 2010 as part of this. If I recall, TNI doesn't report the product key for Office 2010. This isn't such a big deal for us, since we are using the "KMS Licensing" model for our Office 2010. I guess I'm just curious how the next version of TNI will report the licensing or product key for Office 2010 when you are using a KMS license server and don't actually have a product key entered. Thanks.
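For reference, Microsoft's published Windows 7 minimums are a 1 GHz CPU, 1 GB of RAM (2 GB for 64-bit), 16 GB of free disk (20 GB for 64-bit), and a DirectX 9 graphics device. Here's a rough Python sketch of a configurable check along those lines; it's my own illustration of the published baseline, and I don't know whether TNI uses these exact thresholds:

```python
# Sketch: check a machine against the published Windows 7 minimums,
# with thresholds overridable so an organization can set its own bar.
# (Graphics/DirectX 9 capability is omitted; it isn't a simple number.)

def win7_ready(cpu_mhz, ram_gb, free_disk_gb, is_64bit=False,
               min_cpu_mhz=1000, min_ram_gb=None, min_disk_gb=None):
    """Return True if the machine meets the (overridable) Win7 minimums."""
    if min_ram_gb is None:
        min_ram_gb = 2 if is_64bit else 1      # published RAM minimum
    if min_disk_gb is None:
        min_disk_gb = 20 if is_64bit else 16   # published disk minimum
    return (cpu_mhz >= min_cpu_mhz and ram_gb >= min_ram_gb
            and free_disk_gb >= min_disk_gb)

print(win7_ready(cpu_mhz=2600, ram_gb=4, free_disk_gb=120))
```

Being able to pass stricter values (say, `min_ram_gb=2` even for 32-bit) is exactly the kind of override we'd want TNI to expose.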
  22. Thank you for the detailed reply. This is very good information and exactly what I needed!
  23. I have a few questions about the flow of information when doing scans. (More specifically, the amount of data that is transferred.) When I do an immediate online scan, and it installs the remote service and does the remote scan, what is getting transferred and copied back and forth? Here's my understanding of the process simply from observing it. - I run TNI and do an immediate online scan, selecting multiple computers to scan. - TNI then contacts the remote computer, copies a file over, and remotely installs a service on the remote PC to initiate the scan. - The remote process scans, finishes, reports the info back to the computer I'm running the scan from, and then removes itself from the remote PC. My questions are: - What file gets copied to the remote PC? Is it tniaudit.exe? - During the scan, while I'm sitting at my computer running the immediate online scan, TNI provides updates such as "copying file", "scanning (software)", etc. How often is TNI contacting the remote computer to get these updates, and is this an amount of data of any size? - Once the remote scan finishes and sends the data back to my PC, how is this accomplished? Does the remote tniaudit.exe create the .XML file and transfer that back to TNI, or is 'raw data' sent back to my PC and then TNI creates the actual .XML file? - Is the data (or .XML file) that gets sent back to my computer "compressed"? If so, about how large is it? I'm asking these questions because we're trying to get a feel for 'how much' data is being transferred across the network. We have many remote networks all across our state, and bandwidth is an extreme problem and concern for us. The .XML files are about 1.3MB in size. The tniaudit.exe file is 256KB. So using simple math, I'm wondering if the amount of data transferred is around 1.56MB?
But if there are multiple communications between TNI and the remote computer during the scan, that probably adds at least some to that total to scan a single computer. When my managers and network admins ask me how much data is being transferred to do a scan, I'm hoping to get a feel for what's going on so I can give them a fairly accurate answer, as we have thousands of computers on remote networks going across slower WAN connections and we don't want to interfere with daily operations and choke our WAN. Sorry, one more question. When doing a "login script" scan, the tniaudit.exe file gets copied down to the local pc and runs and then it copies the resulting .XML file to the location specified. I assume since the remote tniaudit.exe is the process performing the copy of the .XML back to the audit folder, that there is no compression of the .XML file, and the total data transferred back & forth would be around that 1.56MB, which encompasses the tniaudit.exe being copied and the .XML file being copied to the audit folder. Is that about the amount of data transferred when doing a login script scan? Thank you.
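Just to show my math on the estimate above (using decimal megabytes; the status-update traffic during the scan is unknown and excluded):

```python
# Back-of-envelope per-computer estimate: agent copied out (256 KB)
# plus the resulting XML copied back (~1.3 MB). Status-update chatter
# during the scan is not included, since its size is unknown.
AGENT_KB = 256
XML_MB = 1.3

total_mb = AGENT_KB / 1000 + XML_MB   # decimal MB, matching the ~1.56 figure
print(round(total_mb, 2))  # roughly 1.56 MB per scanned computer
```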