CrushFTP Support Forums
Messages posted by: spinkb
The login system in 6.5 and 7.2 is the same. So what worked in 6.5 will work in 7.2.

But 7.2 has more features on the login page...so you will be missing those if you keep your old login page. But it should still work just fine.

MagicDir and such haven't changed.

Thanks,
Ben
Remove the windows service, and then reinstall the windows service.

Does the service work then?

Thanks,
Ben
To match that dir structure, CrushTask has to be used.

Send us an email at support and I'll give you a user.xml file that has the event already created.

You just need to update the paths in it to point to your folder location and it will do exactly what you are looking for.

Thanks,
Ben
The folder name can't be their username, because two folders at the same level would then have the same name... how would they be able to distinguish which was which?

It is possible, but creating the VFS item with a custom name would have to be done through a login event and a CrushTask-configured item. That could create the VFS item pointer and the VFS.XML file with the permissions on demand at login.

Thanks,
Ben
In the User Manager, if you're not using hashing, double-clicking on the Password: label next to the field will reveal the password. If it's hashed, though, it can't be revealed.

For the scenario you described, you could simply make this config on the default user with a VFS item referencing a variable.

file://somepath/to/folder/Receive/%username%/

and another item

file://somepath/to/folder/Deposit/%username%/

And name one Receive, and the other Deposit. The user logs in, sees both folders, chooses which to go into, and that is it.

All users can link to this common VFS. This doesn't create the folders for you, but allows for the general pointing towards the folders. If you needed the folders created automatically too, you could have a login event referenced as well that generated the folders. You would only need to do this one time, and allow users to be inheriting this setting automatically.
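If you do wire up a login event to create the folders, the logic is just "make Receive/&lt;username&gt; and Deposit/&lt;username&gt; under the shared base path if they don't already exist." A minimal sketch in Python (the base path and username are placeholders; in CrushFTP itself this would be a CrushTask on the login event, not this script):

```python
import os
import tempfile

def ensure_user_folders(base, username):
    # Create Receive/<username> and Deposit/<username> under the shared base.
    # exist_ok=True makes repeated logins harmless.
    for area in ("Receive", "Deposit"):
        os.makedirs(os.path.join(base, area, username), exist_ok=True)

base = tempfile.mkdtemp()  # stand-in for your real /somepath/to/folder
ensure_user_folders(base, "alice")
```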

Thanks,
Ben
Your notify email for this forum is incorrect and the email is bouncing...

I'll reply back here in a little bit.
This capability is in all versions.

Thanks,
Ben
The "Server's Files" pane is your view as an admin of the server so you can drag items over to the user to grant access. It has nothing to do with the user account, and it doesn't change as you move between users; it would be really annoying to have to keep browsing back down to a folder when adding items to different users.

It's working exactly as it's supposed to there. It's the right-hand side that is the Virtual File System (VFS), and that is what you need to be looking at; that is what a user will see. Server's Files is just your file browser for choosing items to add.

Thanks,
Ben
I don't really understand the question unfortunately...

Limited admins can only be admins over a single group. It takes another username to be a limited admin over another group.

Thanks,
Ben
Rick,

There isn't a way to solve this. FTP had the concept of FXP, but most firewalls block it today; it was popular back in the 90s and died with the advent of firewalls.

For your scenario you should have two FTP servers. Your clients each use a DNS name like "myserver.domain.com", but your DNS server resolves that to the LAN IP closest to them.

Then each server also has a folder for the opposite server if you need to give them access. This way they aren't connecting directly to the opposite server (they could if you wanted them to).

But that is the only way to get speed in the transfers.

Thanks,
Ben
Use 04:00 and not 4:00

The other option is to change the offset in hours for the filename it creates.

Thanks,
Ben
Right next to user.xml and vfs.xml there is also a VFS folder. This folder contains XML files without a .XML extension.

These are the files that have the references to your dir paths. These are what you change.

I would do this using TextWrangler on OS X or Notepad++ on Windows.

Thanks,
Ben
Can you email support directly and provide your login page URL so we can see what it does for us?

Thanks,
Ben
Change your event to be on disconnect; that alone may do it...

Or change to on disconnect, copy to another folder, forget all items, find the 1 item in the folder, then process it.

Thanks,
Ben
You could take the items uploaded, copy to a temp folder, then exclude all the items in the list, and find the current items in the temp folder, move them back to the user's home folder, and then process normally. Doing this would wipe away the info about the multiple uploads.

Thanks,
Ben
Could you email us directly at support? I would like to do a screen sharing session with you so we can see what is going on that could be using so much memory.

Let me know,
Ben
On the user doing the share, enable the second row of items; those apply when the users they share to take an action.

When they share now, a copy of the event is made so that when the shared user logs in, the action is taken there.

Then monitor the CrushFTP log to see what it's showing.

Thanks,
Ben
Click Update Now; this is fixed in the latest build.

Thanks,
Ben
You do need to disable those, or it copies your current user config to the temp user instead of inheriting those settings from the TempAccount.

Thanks,
Ben
There is confusion here.

When you put in the % variables, they are replaced before the data is sent to the web browser for the end-user prompt. When the user submits the subject and body back to the server, they cannot put those kinds of variables in. They have a very limited set of variables they can submit coming from the user side of things.

You should be able to do HTML coded email for this too to make the link work the way you want it.



(the href link won't come out in the forum for whatever reason...but it would have the {web_link} variable in it.)
Permission info is all part of that user folder too. The issue you have, though, is that Linux had a path like:

/home/user4/folder1/

And your windows has:

/C:/somefolder/home/user4/folder1/

etc.

And maybe not a C: but maybe a D:, or E: etc.

So this will take a mass find/replace on the files using something like Notepad++: find references to the old path and update them to the new path. Test with one file first to get the syntax right.

Thanks,
Ben
TinyPic is not a link to an image. It's a link to a spam-laden website that shows an image inside a bunch of advertisements. You can't tell a browser to display that webpage as an image... it's not one.

If you have an actual link to an image, then you can, and it will work.

Example

Can you disable HTML in your message here and paste an example, maybe in a CODE block?

--Ben
Enabling that does exclude the header as data...and you can now reference the header lines with curly brackets.

{myFirstColumn}

Otherwise it's {0}, {1}, etc. for getting the columns.

Thanks,
Ben
https://crushftp.com/early6

Thanks,
Ben
There are likely some disk issues or something if you're limited to 40MB/sec on the OS and CrushFTP. I write using FTP on localhost at over 270MB/sec on SSD drives.

The 100ms ping is definitely a latency issue slowing down the transfer, and you will either need to do OS TCP tuning on both sides, possibly firewalls too, and routers...or use CrushTunnel to overcome this.

The 5ms site, however, would have to be on your LAN to get 5ms; just leaving the LAN would take more than that, so something isn't right there. And at 5ms you would be able to get 10MB/sec speeds over FTP or HTTP(S).

Encryption has a speed penalty because of CPU usage. An i7 CPU with two cores used gets around 80MB/sec roughly on a single transfer. But you would still always be able to get 100Mbit even with encryption.

If SFTP is involved, all the rules change as its a different type of protocol doing its own multiplexing and has its own limitations in max speeds.

Thanks,
Ben
On your HTTPS port, turn off the redirect to HTTPS... it's redundant, it's already HTTPS.

20MB/sec is not good performance. I test on my gigabit LAN at 100MB/sec speeds, and on localhost at 270MB/sec. So 20 is not very good... it might indicate a CPU-starved issue.

Do a localhost test using plain FTP, no encryption. You should be able to get your disk speed max...and definitely faster than 20MB/sec.

HTTPS uses encryption, so you need a fast CPU to get fast speeds there. But it can still get around 1Gbit speeds.

The really important thing here though is latency. There is a magic calculation that will give you a rough estimate of the max speed to expect for FTP/HTTP(s) based on latency.

max Mbit = 524 / latency in milliseconds

So if you have 30 ping, that means 524 / 30 = 17Mbit max. For a 100ms ping, that is 5.2Mbit. So it doesn't matter if they have 1Gbit WAN, the latency kills your performance.
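The rule of thumb above is easy to sanity-check. A tiny sketch (the 524 constant is a rough approximation for a single untuned TCP stream, not an exact law):

```python
def max_mbit(latency_ms):
    # Rough ceiling for one untuned FTP/HTTP(S) transfer at a given ping.
    return 524 / latency_ms

print(round(max_mbit(30), 1))   # about 17.5 Mbit at 30 ms ping
print(round(max_mbit(100), 2))  # about 5.24 Mbit at 100 ms ping
```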

To get around this, companies use products like Aspera, CrushTunnel, etc. These overcome latency and still give you full speed. CrushTunnel can get up to 1Gbit speeds, just like competitors do, but requires the enterprise license to use this feature. (Still 10x to 100x cheaper than competitors.)

So what is your latency?

Thanks,
Ben
In the prefs, under Misc, you could enable "delete partial uploads"...
If you have a command line tool for these, you could perform a scan against the file after upload, or after download...

I realize after download wouldn't help exactly, but you would know at least something happened.

If these tools have a stream mode where data can be "read" or "written" through them, we could integrate with them, but I doubt they do...

Thanks,
Ben
This can be done, and fully customized too, but... you must have an enterprise level license to do this. The main reason is that in order to do "conditional" logic, you need an enterprise license. The regular license can do jumping to things, but it can't do comparisons of value 1 versus value 2 to make the decision.

Just to document how this would work: you need a new job item in the Jobs tab.

Step1: UserList

Branched from the UserList task item:
Step1: UserVariable, expire_millis = {parse_start:MM/dd/yyyy hh:mm:ss aa}{expire_password_when}{parse_end}
Step2: Jump conditional, {expire_millis} greater than {add_start:-86400000}{millis}{add_end}
Step3: if true, send email; if false, do nothing



So we first build a variable from the user object loaded in step 1 and parse its date format into milliseconds. Then we compare that value against the current milliseconds minus one day's worth. If the expire time is later than one day ago, we send an email.
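The Jump condition in step 2 reduces to a single comparison. A sketch of the same decision in Python (names are illustrative; CrushTask evaluates this with its own {millis}/{add_start} syntax, not Python):

```python
import time

DAY_MS = 86_400_000  # one day in milliseconds

def should_send_reminder(expire_millis, now_millis=None):
    # True when the password expiry is later than one day ago,
    # i.e. it is upcoming, or expired within the last day.
    if now_millis is None:
        now_millis = int(time.time() * 1000)
    return expire_millis > now_millis - DAY_MS
```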

Thanks,
Ben
A few really important things missing here...

What protocol are you using?

When you say jobs... you're using the Job Scheduler in CrushFTP to push files outgoing?

What is your latency in milliseconds between you and the destination?

Let me know,
Ben
These are permissions the user has (the checkbox items in the admin area of their user in the User Manager.)

Thanks,
Ben
Go to Reports, and choose the report called "User Folder Permissions".

Thanks,
Ben
Edit the prefs.xml file.

Set the flag "relaxed_event_grouping" to be true.

This will group events together better.
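For reference, the entry is a simple boolean flag in prefs.xml; something like the fragment below (the exact position inside the file depends on your install, so treat this as illustrative):

```xml
<relaxed_event_grouping>true</relaxed_event_grouping>
```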

Thanks,
Ben
Whatever user ran CrushFTP is what it's running as.

So check your windows services and see what it's set to run as. If you haven't set it to run as a domain user, configure one.

Or in your copy task, use an smb:// URL to write the file there.

Thanks,
Ben
Yes, this is possible.

In the plugin, near the bottom, enable "use local folder if LDAP HomeDirectory not found". Disable "append username to path", and disable "create folder with username".

Now they will be able to login and will only get the VFS of the "import settings from CrushFTP user:" item.

Thanks,
Ben
I would need to see the job flow to understand exactly how you have it...screenshot?

You can email directly to support if you don't have enough posts to post here.

Thanks,
Ben
Attach requires local items...file:// URL items to be able to attach them.

So an easy workaround is to Find, then Copy to a temp folder (adding to the list for future use), then Exclude "ftp://*" with a task, then Email all with attachments, and finally Delete (which will delete only the local items, since you previously excluded everything that was ftp://*).

Thanks,
Ben
What web browser was used?

What exact CrushFTP version is this? v7.2.0_17?

What sort of "inaccurate" data did this?

Thanks,
Ben
The latest build now has this added. Update and you will get the new feature in the HTTP task to be able to save the result.

Thanks,
Ben
You may need to...but if it's on 7.2, you don't.

Thanks,
Ben
Please click update now to get version 7.2. Then test things again.

These issues should be fixed already.

Thanks,
Ben
The HTTP task will be able to do this soon...likely tomorrow or Monday. We just have to add the UI for it.

In the current version you're using, you had to do a Find... but there is no such thing for an HTTP type, unless it's a CrushFTP server that understands our request for a dir listing, or you use WebDAV.

This new build that is coming will allow you to specify the filename, and save the result response to your own file locally. Then you can Find it there, and work with it normally.

Thanks,
Ben
I replied to your direct email on this. Continuing this there.

Thanks,
Ben
Email us directly at support with your license info. This is most likely a bug with your license name.

Let me know,
Ben
New builds will have this set by default, however, you will need to download manually a new copy of the file to replace yours.

https://crushftp.com/early7/CrushFTP7_PC/WebInterface/localizations/

Thanks,
Ben
The right side of the drop-down on Server's Files remembers prior frequent paths. Use that to grab a history item you have used before.

But for your question, no... just the history quick selector.

Thanks,
Ben
New:

  • tunnel improvements for disconnected sockets

  • multi segment downloading from S3 bucket

  • thumbnail previews now operate on VFS items and not just local file:// references (SMB, FTP, SFTP)

  • tasks have a {working_dir} variable they can access

  • added link task

  • added support for Job references in folder monitor, alerts, and events

  • allow access to server_info variables

  • support custom java classes for task type

  • added additional cache options for FindCache, Copy/Move cache

  • added date time scenarios for Wait task

  • allow setting the supported MACs for SFTP

  • sftp can do multithreaded listings

  • support and / or operator for WebInterface search

  • added alert type for monitoring server variables

  • hide actual file path from upload exceptions

  • improved SQL speed for users stored in a DB by using better caching methods

  • allow mass updating of users as a limited admin, not just full admins

  • supports MSSQL for StatsDB now and reports

  • ignores requests for getadminxmllisting by non admins and limitedadmins

  • allow share generation to suggest a user/pass to utilize

  • support s3crush revision tracking

  • allow a user to upload and overwrite a file that was in use by them (from a leftover dead connection)

  • increased SFTP buffer size default to improve Linux performance issues

  • stop excessive user backups done at every login

  • allow events to be applied to only shared accounts a user makes

  • removed duplicate login event call

  • added user usage report

  • improved job monitor speed to ask for less info unless needed

  • server log now set to click to activate

  • added minimum upload speed, and minimum download speed restrictions for alerts.  (negative value to trigger an alert)

  • multithreaded CrushSync uploads and downloads

  • added accelerated multi segmented downloads from S3

  • better memory handling to keep caches cleaned up after they won’t be used again

  • added ability to cache all local file items in memory for faster searches

  • added magic ports starting with 444 in tunnels to know if the port is FTP or HTTP

  • added test VFS button

  • CrushFTP now defaults to TLS mode, with SSLv3 optional (POODLE vulnerability fix)

  • supports unencrypted DMZ connection for speed

  • webInterface supports file sizes on folders when uploading with advanced mode

  • added memory-based filesystem support for temp storage location

  • added {working_dir} variable to CrushTask

  • added Link task to link in other jobs to a job.

  • more variables are accessible to CrushTask actions now, all server_info items

  • added support for custom CrushTask java task items from 3rd parties.

  • added support for modifying FindCache references in CrushTask for Copy and Move actions

  • added date and time scenarios to WaitTask until…

  • added support to “touch” a file using a rename

  • added garbageCollection on demand calls to CrushTask.

  • added MD5 file hash calculations on file copies in CrushTask

  • added file timing on CrushTask Copy actions

  • added {full_log} variable reference for CrushTask which could be embedded in emails

  • AS2 supports HTTPS client cert auth

  • added completion types for CrushTask of killed, cancelled, or completed

  • allow setting custom headers in HTTP CrushTask items

  • PGP task supports hinting on decrypted file size

  • added looping on email task to attach one file at a time

  • added support for starttls on PopImap task

  • drastically improved the speed of short running jobs

  • added {MMMM} for full month names in CrushTask

  • allows any heap memory size for CrushFTPDrive and CrushSync

  • added sync now menu item to force a sync in CrushSync

  • added growl style notifications in CrushSync to warn you if sync is offline

  • added last_login tracking for User Manager accounts

  • added max_logins onto User Usage report

  • added {web_link_end} variable for Share email body

  • support quota usage for plugin based users

  • added support to generate heap dumps for admin users for troubleshooting

  • added support for Radius challenge / response system for one time use codes

  • added max expiration day config for shares and default expiration days

  • added support for responding with failed MDN messages

  • added reverse connections for ports in DMZ scenarios

  • added audit report and started tracking additional audit items like rename, delete

  • added support for events running in async mode event by event

  • added support to find user for password reset when in SQL mode for users

  • added history tracking for current uploads/downloads in progress for admin UI graphs

  • added per user password salting support

  • added job queueing for async events

  • added salt to tab delimited import

  • increased ACL lookup speed for LDAP ACL mode

  • added {size} for sharing email

  • added net mask support for ServerBeat

  • added support for events and statistics on copy/paste actions

  • added control for max event threads.

  • supports smtp mail From with pretty formatting

  • hides MACOSX garbage items from zip previews

  • added flag to control writing session logs

  • added progress bars for searching in WebInterface

  • added date/time localization support and many other enhancements for localization

  • added login page themes in the prefs for quick customizations

  • added new report JobSchedules

  • added new report AuditSummary

  • added hover over info in dashboard to see transfers that were in progress during bandwidth usage

  • check for update now looks for new builds too

  • added more controls in PGP task for signing and verifying

  • added port forward server time type

  • added mass update for banning list to paste in text

  • enhanced radius plugin supporting more custom folders at login

  • enhanced LDAP plugin to support individual key mapping

  • enhanced job monitor to add breakpoints between all steps or clear breakpoints

  • added sort task item

  • added test button for custom VFS items in the User Manager

  • added option to not save state or history if so desired for jobs/events

  • added multithreading capability to CrushSync for faster transfers of small items


Fixes:

  • fixed some SMB issues for rename move actions

  • fixed bug with Find task not throwing an error when no items were found

  • fixed multiple s3 buckets in single VFS

  • fixed name reference when items are unzipped

  • fixed job restoring after a server restart (multiple threads scenario)

  • webdav fixes

  • fixed max login time and dmz scenario

  • fixed encrypting the URL in VFS

  • fixed VIP movement issues with ServerBeat when both machines are offline

  • fixed memory leak with prior FTP sessions

  • fixed missing log entries for HTTP/SFTP RETR & STOR operations

  • fixed SSL manager error with blank trusted cert list

  • fixed csrf on downloads

  • Jobs UI fixes

  • login.html file redirection link restriction fix

  • sftp logging fix

  • fix overwrite not working on File objects when specified

  • fix for s3 downloads not closing properly

  • fixed client cert auth connects for HTTPS

  • fixed symlink support for SFTP client instead of removing them from listings

  • fixed SMB authentication errors

  • fixed DMZ internally routing connections through internal server not working

  • fixed bugs with known_host file support in SFTP client

  • fixed restoring a job after server restart and loading up prior cached file info

  • fixed names on CrushTask unzipped items

  • fixed file length references on CrushTask copied items

  • fixed UserList task on how it calls its subtask items

  • fixed closing connections bug with CopyTask actions

  • fixed bug with running multiple copies of the same job simultaneously (called from an event)

  • CrushSync threading fixes when multiple syncs are configured at a time

  • protects against getting banned during an upgrade

  • increased time-out for CrushSync and really slow dir listings

  • fixed bug with email templates with spaces in their name

  • restored http header access for plugins

  • fixed issue with TempAccounts and DMZ mode

  • fixed dir listings for FTPES clients and empty folders

  • fixed date locale for miniURLs

  • don’t waste connections on TempAccounts with a limit configured

  • fixed bug with upper/lower username case flag

  • fix race condition for plugin losing active username info

  • fixed bug with access-allow-origin not working

  • fixed bug when log rolling greater than 20 days

  • fixed a deadlock scenario where the server could freeze under the right conditions

  • fixed bug taking down DMZ instance

  • improved socket cleanup for high load

  • fixed reverse proxy not matching path too

  • fixed default SQL config with datediff

  • fixed keystore arrangement of certs when adding in trusted certs

  • fixed bug with change password while having SMB or S3 filesystem

  • fixed bug in SSL test not testing the keystore properly

  • fixed bug where failed login count wasn’t reset on success

  • fixed excessive logging in WebInterface actions

  • fixed DMZ bug with bad username/email requests

  • fixed linked vfs and SQL User Manager.

  • no temp rename on upload for S3Crush objects

  • fixed missing keep-alive header on redirect

  • fixed locked auth object for SMTP email

  • improved CSR generation for UK customers

  • fixed as2 message ids

  • fixed issues using TLSv1.1 and TLSv1.2

  • fixed hanging dir listing in SFTP when no files existed

  • fixed errors on mass uploads where cleanup is discarding them

  • fixes for UNC paths and Preview generation

  • fixed media playback with MP4 files and improved slideshow handling of MP4

  • fixed issues in the job monitor for active jobs not refreshing right

  • fixed issue with duplicating events


You need to update CrushFTP to have this. It's in 7.1.

Thanks,
Ben
Try our server now. It had an issue we have fixed.

The tls_version entry in the prefs.xml file needs to have the SSLv2Hello item in it, or WebDAV on Windows fails.
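For illustration, the tls_version value would need SSLv2Hello alongside whatever protocols are already listed; something like the fragment below (the other protocol names shown are placeholders for whatever your prefs.xml already contains):

```xml
<tls_version>SSLv2Hello,TLSv1,TLSv1.1,TLSv1.2</tls_version>
```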

Thanks,
Ben
 