Graham Steffaniak 2024-10-07 17:44:53 -05:00 committed by GitHub
parent b4b92bf852
commit 1608877813
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
50 changed files with 1281 additions and 954 deletions


@ -2,6 +2,23 @@
All notable changes to this project will be documented in this file. For commit guidelines, please refer to [Standard Version](https://github.com/conventional-changelog/standard-version).
## v0.2.10
**New Features**:
- Allows user creation via command-line arguments https://github.com/gtsteffaniak/filebrowser/issues/196
- Folder sizes are always shown, leveraging the index. https://github.com/gtsteffaniak/filebrowser/issues/138
- Searching files by size is no longer slower than other searches.
**Bugfixes**:
- Fixed file selection when in single-click mode https://github.com/gtsteffaniak/filebrowser/issues/214
- Fixed displayed search context on root directory
- Fixed an issue where searching for files "smaller than" a given size actually returned files "larger than" it
**Notes**:
- Memory usage from index is reduced by ~40%
- Indexing time has increased 2x due to the extra processing time required to calculate directory sizes.
- File size calculations use base 1024 instead of the previous base 1000 (matching Windows Explorer)
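The base-1024 change in the notes above can be sketched with a small Go helper. This is an illustrative sketch only, not the project's actual formatting function; `humanSize` is a hypothetical name:

```go
package main

import "fmt"

// humanSize formats a byte count using base-1024 math (as Windows
// Explorer does), rather than the base-1000 math used previously.
func humanSize(b int64) string {
	const unit = 1024
	if b < unit {
		return fmt.Sprintf("%d B", b)
	}
	div, exp := int64(unit), 0
	for n := b / unit; n >= unit; n /= unit {
		div *= unit
		exp++
	}
	return fmt.Sprintf("%.1f %cB", float64(b)/float64(div), "KMGTPE"[exp])
}

func main() {
	// With base 1024, 1048576 bytes is exactly 1.0 MB;
	// base-1000 math would have reported ~1.05 MB.
	fmt.Println(humanSize(1048576))
}
```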
## v0.2.9
This release focused on the UI navigation experience, improving keyboard navigation and adding a right-click context menu.


@ -6,7 +6,7 @@
</p> </p>
<h3 align="center">FileBrowser Quantum - A modern web-based file manager</h3> <h3 align="center">FileBrowser Quantum - A modern web-based file manager</h3>
<p align="center"> <p align="center">
<img width="800" src="https://github.com/user-attachments/assets/8ba93582-aba2-4996-8ac3-25f763a2e596" title="Main Screenshot"> <img width="800" src="https://private-user-images.githubusercontent.com/42989099/367975355-3d6f4619-4985-4ce3-952f-286510dff4f1.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MjgxNTA2MjEsIm5iZiI6MTcyODE1MDMyMSwicGF0aCI6Ii80Mjk4OTA5OS8zNjc5NzUzNTUtM2Q2ZjQ2MTktNDk4NS00Y2UzLTk1MmYtMjg2NTEwZGZmNGYxLnBuZz9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNDEwMDUlMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjQxMDA1VDE3NDUyMVomWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPTg1OGNlMWM3M2I1ZmY3MDcxMGU1ODc3N2ZkMjI5YWQ3YzEyODRmNDU0ZDkxMjJhNTU0ZGY1MDQ2YmIwOWRmMTgmWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0In0.mOl0Hm70XmQEk-DPzx1FbwrpxNMDAqb-WDprs1HK-mc" title="Main Screenshot">
</p> </p>
> [!WARNING] > [!WARNING]
@ -18,9 +18,9 @@
FileBrowser Quantum is a fork of the filebrowser opensource project with the FileBrowser Quantum is a fork of the filebrowser opensource project with the
following changes: following changes:
1. [x] Enhanced lightning fast indexed search 1. [x] Efficiently indexed files
- Real-time results as you type - Real-time search results as you type
- Works with more type filters - Search works with more type filters
- Enhanced interactive results page. - Enhanced interactive results page.
2. [x] Revamped and simplified GUI navbar and sidebar menu. 2. [x] Revamped and simplified GUI navbar and sidebar menu.
- Additional compact view mode as well as refreshed view mode - Additional compact view mode as well as refreshed view mode
@ -131,39 +131,30 @@ Not using docker (not recommended), download your binary from releases and run w
./filebrowser -c <filebrowser.yml or other /path/to/config.yaml> ./filebrowser -c <filebrowser.yml or other /path/to/config.yaml>
``` ```
## Command Line Usage
There are very few commands available. Three actions are done via the command line:
1. Running the program, as shown in the install step. The only argument used is the config file, if you choose to override the default "filebrowser.yaml".
2. Checking the version info via `./filebrowser version`.
3. Updating the DB, which currently only supports adding users via `./filebrowser set -u username,password [-a] [-s "example/scope"]`.
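The argument handling for the `set` action can be sketched as follows. `parseSetArgs` is a hypothetical helper for illustration only; the real parsing lives in `backend/cmd`:

```go
package main

import (
	"flag"
	"fmt"
	"strings"
)

// parseSetArgs mirrors the documented "set" flags: -u "username,password",
// an optional -a (create as admin), and an optional -s (user scope).
func parseSetArgs(args []string) (username, password, scope string, admin bool, err error) {
	setCmd := flag.NewFlagSet("set", flag.ContinueOnError)
	var user string
	setCmd.StringVar(&user, "u", "", "comma-separated username and password")
	setCmd.BoolVar(&admin, "a", false, "create user as admin")
	setCmd.StringVar(&scope, "s", "", "user scope (defaults to config scope)")
	if err = setCmd.Parse(args); err != nil {
		return
	}
	parts := strings.SplitN(user, ",", 2)
	if len(parts) < 2 {
		err = fmt.Errorf("not enough info to create user: \"set -u username,password\"")
		return
	}
	return parts[0], parts[1], scope, admin, nil
}

func main() {
	u, _, s, a, err := parseSetArgs([]string{"-u", "alice,secret", "-a", "-s", "example/scope"})
	if err != nil {
		panic(err)
	}
	fmt.Println(u, s, a)
}
```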
## Configuration ## Configuration
All configuration is now done via a single configuration file: All configuration is now done via a single configuration file:
`filebrowser.yaml`, here is an example of minimal [configuration `filebrowser.yaml`, here is an example of minimal [configuration
file](./backend/filebrowser.yaml). file](./backend/filebrowser.yaml).
View the [Configuration Help Page](./configuration.md) for available View the [Configuration Help Page](./docs/configuration.md) for available
configuration options and other help. configuration options and other help.
## Migration from filebrowser/filebrowser ## Migration from filebrowser/filebrowser
If you currently use the original opensource filebrowser If you currently use the original filebrowser but want to try using this,
but want to try using this. I recommend you start fresh without I recommend you start fresh without reusing the database. If you want to
reusing the database, but there are a few things you'll need to do if you migrate your existing database to FileBrowser Quantum, visit the [migration
must migrate: readme](./docs/migration.md)
1. Create a configuration file as mentioned above.
2. Copy your database file from the original filebrowser to the path of
the new one.
3. Update the configuration file to use the database (under server in
filebrowser.yml)
4. If you are using docker, update the docker-compose file or docker run
command to use the config file as described in the install section
above.
5. If you are not using docker, just make sure you run filebrowser -c
filebrowser.yml and have a valid filebrowser config.
The filebrowser Quantum application should run with the same user and rules that
you have from the original. But keep in mind the differences that are
mentioned at the top of this readme.
## Comparison Chart ## Comparison Chart
@ -217,4 +208,4 @@ Chromecast support | ❌ | ❌ | ✅ | ❌ | ❌ | ❌ |
## Roadmap ## Roadmap
see [Roadmap Page](./roadmap.md) see [Roadmap Page](./docs/roadmap.md)


@ -7,26 +7,27 @@
PASS PASS
ok github.com/gtsteffaniak/filebrowser/diskcache 0.004s ok github.com/gtsteffaniak/filebrowser/diskcache 0.004s
? github.com/gtsteffaniak/filebrowser/errors [no test files] ? github.com/gtsteffaniak/filebrowser/errors [no test files]
2024/10/07 12:46:34 could not update unknown type: unknown
goos: linux goos: linux
goarch: amd64 goarch: amd64
pkg: github.com/gtsteffaniak/filebrowser/files pkg: github.com/gtsteffaniak/filebrowser/files
cpu: 11th Gen Intel(R) Core(TM) i5-11320H @ 3.20GHz cpu: 11th Gen Intel(R) Core(TM) i5-11320H @ 3.20GHz
BenchmarkFillIndex-8 10 3559830 ns/op 274639 B/op 2026 allocs/op BenchmarkFillIndex-8 10 3847878 ns/op 758424 B/op 5567 allocs/op
BenchmarkSearchAllIndexes-8 10 31912612 ns/op 20545741 B/op 312477 allocs/op BenchmarkSearchAllIndexes-8 10 780431 ns/op 173444 B/op 2014 allocs/op
PASS PASS
ok github.com/gtsteffaniak/filebrowser/files 0.417s ok github.com/gtsteffaniak/filebrowser/files 0.073s
PASS PASS
ok github.com/gtsteffaniak/filebrowser/fileutils 0.002s ok github.com/gtsteffaniak/filebrowser/fileutils 0.003s
2024/08/27 16:16:13 h: 401 <nil> 2024/10/07 12:46:34 h: 401 <nil>
2024/08/27 16:16:13 h: 401 <nil> 2024/10/07 12:46:34 h: 401 <nil>
2024/08/27 16:16:13 h: 401 <nil> 2024/10/07 12:46:34 h: 401 <nil>
2024/08/27 16:16:13 h: 401 <nil> 2024/10/07 12:46:34 h: 401 <nil>
2024/08/27 16:16:13 h: 401 <nil> 2024/10/07 12:46:34 h: 401 <nil>
2024/08/27 16:16:13 h: 401 <nil> 2024/10/07 12:46:34 h: 401 <nil>
PASS PASS
ok github.com/gtsteffaniak/filebrowser/http 0.100s ok github.com/gtsteffaniak/filebrowser/http 0.080s
PASS PASS
ok github.com/gtsteffaniak/filebrowser/img 0.124s ok github.com/gtsteffaniak/filebrowser/img 0.137s
PASS PASS
ok github.com/gtsteffaniak/filebrowser/rules 0.002s ok github.com/gtsteffaniak/filebrowser/rules 0.002s
PASS PASS
@ -38,4 +39,5 @@ ok github.com/gtsteffaniak/filebrowser/settings 0.004s
? github.com/gtsteffaniak/filebrowser/storage/bolt [no test files] ? github.com/gtsteffaniak/filebrowser/storage/bolt [no test files]
PASS PASS
ok github.com/gtsteffaniak/filebrowser/users 0.002s ok github.com/gtsteffaniak/filebrowser/users 0.002s
? github.com/gtsteffaniak/filebrowser/utils [no test files]
? github.com/gtsteffaniak/filebrowser/version [no test files] ? github.com/gtsteffaniak/filebrowser/version [no test files]


@ -11,28 +11,28 @@ import (
"os" "os"
"os/signal" "os/signal"
"strconv" "strconv"
"strings"
"syscall" "syscall"
"embed" "embed"
"github.com/spf13/pflag"
"github.com/spf13/cobra"
"github.com/gtsteffaniak/filebrowser/auth"
"github.com/gtsteffaniak/filebrowser/diskcache" "github.com/gtsteffaniak/filebrowser/diskcache"
"github.com/gtsteffaniak/filebrowser/files" "github.com/gtsteffaniak/filebrowser/files"
fbhttp "github.com/gtsteffaniak/filebrowser/http" fbhttp "github.com/gtsteffaniak/filebrowser/http"
"github.com/gtsteffaniak/filebrowser/img" "github.com/gtsteffaniak/filebrowser/img"
"github.com/gtsteffaniak/filebrowser/settings" "github.com/gtsteffaniak/filebrowser/settings"
"github.com/gtsteffaniak/filebrowser/storage"
"github.com/gtsteffaniak/filebrowser/users" "github.com/gtsteffaniak/filebrowser/users"
"github.com/gtsteffaniak/filebrowser/utils"
"github.com/gtsteffaniak/filebrowser/version" "github.com/gtsteffaniak/filebrowser/version"
) )
//go:embed dist/* //go:embed dist/*
var assets embed.FS var assets embed.FS
var nonEmbededFS = os.Getenv("FILEBROWSER_NO_EMBEDED") == "true" var (
nonEmbededFS = os.Getenv("FILEBROWSER_NO_EMBEDED") == "true"
)
type dirFS struct { type dirFS struct {
http.Dir http.Dir
@ -42,24 +42,131 @@ func (d dirFS) Open(name string) (fs.File, error) {
return d.Dir.Open(name) return d.Dir.Open(name)
} }
func init() { func getStore(config string) (*storage.Storage, bool) {
// Define a flag for the config option (-c or --config) // Use the config file (global flag)
configFlag := pflag.StringP("config", "c", "filebrowser.yaml", "Path to the config file") log.Printf("Using Config file : %v", config)
// Bind the flags to the pflag command line parser settings.Initialize(config)
pflag.CommandLine.AddGoFlagSet(flag.CommandLine) store, hasDB, err := storage.InitializeDb(settings.Config.Server.Database)
pflag.Parse() if err != nil {
log.Printf("Initializing FileBrowser Quantum (%v) with config file: %v \n", version.Version, *configFlag) log.Fatal("could not load db info: ", err)
log.Println("Embeded Frontend:", !nonEmbededFS) }
settings.Initialize(*configFlag) return store, hasDB
} }
var rootCmd = &cobra.Command{ func generalUsage() {
Use: "filebrowser", fmt.Printf(`usage: ./filebrowser <command> [options]
Run: python(func(cmd *cobra.Command, args []string, d pythonData) { commands:
serverConfig := settings.Config.Server set Update the database, currently only supports adding
if !d.hadDB { users: "set -u username,password [-a] [-s example/scope]"
quickSetup(d) version Print version and release information
` + "\n")
} }
func StartFilebrowser() {
// Global flags
var configPath string
var help bool
// Override the default usage output to use generalUsage()
flag.Usage = generalUsage
flag.StringVar(&configPath, "c", "filebrowser.yaml", "Path to the config file.")
flag.BoolVar(&help, "h", false, "Get help about commands")
// Parse global flags (before subcommands)
flag.Parse() // print generalUsage on error
// Show help if requested
if help {
generalUsage()
return
}
// Create a new FlagSet for the 'set' subcommand
setCmd := flag.NewFlagSet("set", flag.ExitOnError)
var user, scope, dbConfig string
var asAdmin bool
setCmd.StringVar(&user, "u", "", "Comma-separated username and password: \"set -u <username>,<password>\"")
setCmd.BoolVar(&asAdmin, "a", false, "Create user as admin user, used in combination with -u")
setCmd.StringVar(&scope, "s", "", "Specify a user scope, otherwise default user config scope is used")
setCmd.StringVar(&dbConfig, "c", "filebrowser.yaml", "Path to the config file.")
// Parse subcommand flags only if a subcommand is specified
if len(os.Args) > 1 {
switch os.Args[1] {
case "set":
err := setCmd.Parse(os.Args)
if err != nil {
setCmd.PrintDefaults()
os.Exit(1)
}
userInfo := strings.Split(user, ",")
if len(userInfo) < 2 {
fmt.Println("not enough info to create user: \"set -u username,password\"")
setCmd.PrintDefaults()
os.Exit(1)
}
username := userInfo[0]
password := userInfo[1]
getStore(dbConfig)
// Create the user logic
if asAdmin {
log.Printf("Creating user as admin: %s\n", username)
} else {
log.Printf("Creating user: %s\n", username)
}
newUser := users.User{
Username: username,
Password: password,
}
if scope != "" {
newUser.Scope = scope
}
err = storage.CreateUser(newUser, asAdmin)
if err != nil {
log.Fatal("Could not create user: ", err)
}
return
case "version":
fmt.Println("FileBrowser Quantum - A modern web-based file manager")
fmt.Printf("Version : %v\n", version.Version)
fmt.Printf("Commit : %v\n", version.CommitSHA)
fmt.Printf("Release Info : https://github.com/gtsteffaniak/filebrowser/releases/tag/%v\n", version.Version)
return
}
}
store, dbExists := getStore(configPath)
indexingInterval := fmt.Sprint(settings.Config.Server.IndexingInterval, " minutes")
if !settings.Config.Server.Indexing {
indexingInterval = "disabled"
}
database := fmt.Sprintf("Using existing database : %v", settings.Config.Server.Database)
if !dbExists {
database = fmt.Sprintf("Creating new database : %v", settings.Config.Server.Database)
}
log.Printf("Initializing FileBrowser Quantum (%v)\n", version.Version)
log.Println("Embedded frontend :", !nonEmbededFS)
log.Println(database)
log.Println("Sources :", settings.Config.Server.Root)
log.Print("Indexing interval : ", indexingInterval)
serverConfig := settings.Config.Server
// initialize indexing and schedule indexing every n minutes (default 5)
go files.InitializeIndex(serverConfig.IndexingInterval, serverConfig.Indexing)
if err := rootCMD(store, &serverConfig); err != nil {
log.Fatal("Error starting filebrowser:", err)
}
}
func cleanupHandler(listener net.Listener, c chan os.Signal) { //nolint:interfacer
sig := <-c
log.Printf("Caught signal %s: shutting down.", sig)
listener.Close()
os.Exit(0)
}
func rootCMD(store *storage.Storage, serverConfig *settings.Server) error {
if serverConfig.NumImageProcessors < 1 { if serverConfig.NumImageProcessors < 1 {
log.Fatal("Image resize workers count could not be < 1") log.Fatal("Image resize workers count could not be < 1")
} }
@ -79,31 +186,30 @@ var rootCmd = &cobra.Command{
// No-op cache if no cacheDir is specified // No-op cache if no cacheDir is specified
fileCache = diskcache.NewNoOp() fileCache = diskcache.NewNoOp()
} }
// initialize indexing and schedule indexing ever n minutes (default 5)
go files.InitializeIndex(serverConfig.IndexingInterval, serverConfig.Indexing) fbhttp.SetupEnv(store, serverConfig, fileCache)
_, err := os.Stat(serverConfig.Root) _, err := os.Stat(serverConfig.Root)
checkErr(fmt.Sprint("cmd os.Stat ", serverConfig.Root), err) utils.CheckErr(fmt.Sprint("cmd os.Stat ", serverConfig.Root), err)
var listener net.Listener var listener net.Listener
address := serverConfig.Address + ":" + strconv.Itoa(serverConfig.Port) address := serverConfig.Address + ":" + strconv.Itoa(serverConfig.Port)
switch { switch {
case serverConfig.Socket != "": case serverConfig.Socket != "":
listener, err = net.Listen("unix", serverConfig.Socket) listener, err = net.Listen("unix", serverConfig.Socket)
checkErr("net.Listen", err) utils.CheckErr("net.Listen", err)
socketPerm, err := cmd.Flags().GetUint32("socket-perm") //nolint:govet err = os.Chmod(serverConfig.Socket, os.FileMode(0666)) // socket-perm
checkErr("cmd.Flags().GetUint32", err) utils.CheckErr("os.Chmod", err)
err = os.Chmod(serverConfig.Socket, os.FileMode(socketPerm))
checkErr("os.Chmod", err)
case serverConfig.TLSKey != "" && serverConfig.TLSCert != "": case serverConfig.TLSKey != "" && serverConfig.TLSCert != "":
cer, err := tls.LoadX509KeyPair(serverConfig.TLSCert, serverConfig.TLSKey) //nolint:govet cer, err := tls.LoadX509KeyPair(serverConfig.TLSCert, serverConfig.TLSKey) //nolint:govet
checkErr("tls.LoadX509KeyPair", err) utils.CheckErr("tls.LoadX509KeyPair", err)
listener, err = tls.Listen("tcp", address, &tls.Config{ listener, err = tls.Listen("tcp", address, &tls.Config{
MinVersion: tls.VersionTLS12, MinVersion: tls.VersionTLS12,
Certificates: []tls.Certificate{cer}}, Certificates: []tls.Certificate{cer}},
) )
checkErr("tls.Listen", err) utils.CheckErr("tls.Listen", err)
default: default:
listener, err = net.Listen("tcp", address) listener, err = net.Listen("tcp", address)
checkErr("net.Listen", err) utils.CheckErr("net.Listen", err)
} }
sigc := make(chan os.Signal, 1) sigc := make(chan os.Signal, 1)
signal.Notify(sigc, os.Interrupt, syscall.SIGTERM) signal.Notify(sigc, os.Interrupt, syscall.SIGTERM)
@ -113,8 +219,8 @@ var rootCmd = &cobra.Command{
if err != nil { if err != nil {
log.Fatal("Could not embed frontend. Does backend/cmd/dist exist? Must be built and exist first") log.Fatal("Could not embed frontend. Does backend/cmd/dist exist? Must be built and exist first")
} }
handler, err := fbhttp.NewHandler(imgSvc, fileCache, d.store, &serverConfig, assetsFs) handler, err := fbhttp.NewHandler(imgSvc, assetsFs)
checkErr("fbhttp.NewHandler", err) utils.CheckErr("fbhttp.NewHandler", err)
defer listener.Close() defer listener.Close()
log.Println("Listening on", listener.Addr().String()) log.Println("Listening on", listener.Addr().String())
//nolint: gosec //nolint: gosec
@ -123,8 +229,8 @@ var rootCmd = &cobra.Command{
} }
} else { } else {
assetsFs := dirFS{Dir: http.Dir("frontend/dist")} assetsFs := dirFS{Dir: http.Dir("frontend/dist")}
handler, err := fbhttp.NewHandler(imgSvc, fileCache, d.store, &serverConfig, assetsFs) handler, err := fbhttp.NewHandler(imgSvc, assetsFs)
checkErr("fbhttp.NewHandler", err) utils.CheckErr("fbhttp.NewHandler", err)
defer listener.Close() defer listener.Close()
log.Println("Listening on", listener.Addr().String()) log.Println("Listening on", listener.Addr().String())
//nolint: gosec //nolint: gosec
@ -132,54 +238,5 @@ var rootCmd = &cobra.Command{
log.Fatalf("Could not start server on port %d: %v", serverConfig.Port, err) log.Fatalf("Could not start server on port %d: %v", serverConfig.Port, err)
} }
} }
return nil
}, pythonConfig{allowNoDB: true}),
}
func StartFilebrowser() {
if err := rootCmd.Execute(); err != nil {
log.Fatal("Error starting filebrowser:", err)
}
}
func cleanupHandler(listener net.Listener, c chan os.Signal) { //nolint:interfacer
sig := <-c
log.Printf("Caught signal %s: shutting down.", sig)
listener.Close()
os.Exit(0)
}
func quickSetup(d pythonData) {
settings.Config.Auth.Key = generateKey()
if settings.Config.Auth.Method == "noauth" {
err := d.store.Auth.Save(&auth.NoAuth{})
checkErr("d.store.Auth.Save", err)
} else {
settings.Config.Auth.Method = "password"
err := d.store.Auth.Save(&auth.JSONAuth{})
checkErr("d.store.Auth.Save", err)
}
err := d.store.Settings.Save(&settings.Config)
checkErr("d.store.Settings.Save", err)
err = d.store.Settings.SaveServer(&settings.Config.Server)
checkErr("d.store.Settings.SaveServer", err)
user := users.ApplyDefaults(users.User{})
user.Username = settings.Config.Auth.AdminUsername
user.Password = settings.Config.Auth.AdminPassword
user.Perm.Admin = true
user.Scope = "./"
user.DarkMode = true
user.ViewMode = "normal"
user.LockPassword = false
user.Perm = settings.Permissions{
Create: true,
Rename: true,
Modify: true,
Delete: true,
Share: true,
Download: true,
Admin: true,
}
err = d.store.Users.Save(&user)
checkErr("d.store.Users.Save", err)
} }


@ -6,7 +6,9 @@ import (
"github.com/spf13/cobra" "github.com/spf13/cobra"
"github.com/gtsteffaniak/filebrowser/settings" "github.com/gtsteffaniak/filebrowser/settings"
"github.com/gtsteffaniak/filebrowser/storage"
"github.com/gtsteffaniak/filebrowser/users" "github.com/gtsteffaniak/filebrowser/users"
"github.com/gtsteffaniak/filebrowser/utils"
) )
func init() { func init() {
@ -40,27 +42,27 @@ including 'index_end'.`,
return nil return nil
}, },
Run: python(func(cmd *cobra.Command, args []string, d pythonData) { Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
i, err := strconv.Atoi(args[0]) i, err := strconv.Atoi(args[0])
checkErr("strconv.Atoi", err) utils.CheckErr("strconv.Atoi", err)
f := i f := i
if len(args) == 2 { //nolint:gomnd if len(args) == 2 { //nolint:gomnd
f, err = strconv.Atoi(args[1]) f, err = strconv.Atoi(args[1])
checkErr("strconv.Atoi", err) utils.CheckErr("strconv.Atoi", err)
} }
user := func(u *users.User) { user := func(u *users.User) {
u.Rules = append(u.Rules[:i], u.Rules[f+1:]...) u.Rules = append(u.Rules[:i], u.Rules[f+1:]...)
err := d.store.Users.Save(u) err := store.Users.Save(u)
checkErr("d.store.Users.Save", err) utils.CheckErr("store.Users.Save", err)
} }
global := func(s *settings.Settings) { global := func(s *settings.Settings) {
s.Rules = append(s.Rules[:i], s.Rules[f+1:]...) s.Rules = append(s.Rules[:i], s.Rules[f+1:]...)
err := d.store.Settings.Save(s) err := store.Settings.Save(s)
checkErr("d.store.Settings.Save", err) utils.CheckErr("store.Settings.Save", err)
} }
runRules(d.store, cmd, user, global) runRules(store, cmd, user, global)
}, pythonConfig{}), }),
} }


@ -10,10 +10,10 @@ import (
"github.com/gtsteffaniak/filebrowser/settings" "github.com/gtsteffaniak/filebrowser/settings"
"github.com/gtsteffaniak/filebrowser/storage" "github.com/gtsteffaniak/filebrowser/storage"
"github.com/gtsteffaniak/filebrowser/users" "github.com/gtsteffaniak/filebrowser/users"
"github.com/gtsteffaniak/filebrowser/utils"
) )
func init() { func init() {
rootCmd.AddCommand(rulesCmd)
rulesCmd.PersistentFlags().StringP("username", "u", "", "username of user to which the rules apply") rulesCmd.PersistentFlags().StringP("username", "u", "", "username of user to which the rules apply")
rulesCmd.PersistentFlags().UintP("id", "i", 0, "id of user to which the rules apply") rulesCmd.PersistentFlags().UintP("id", "i", 0, "id of user to which the rules apply")
} }
@ -33,7 +33,7 @@ func runRules(st *storage.Storage, cmd *cobra.Command, usersFn func(*users.User)
id := getUserIdentifier(cmd.Flags()) id := getUserIdentifier(cmd.Flags())
if id != nil { if id != nil {
user, err := st.Users.Get("", id) user, err := st.Users.Get("", id)
checkErr("st.Users.Get", err) utils.CheckErr("st.Users.Get", err)
if usersFn != nil { if usersFn != nil {
usersFn(user) usersFn(user)
@ -44,7 +44,7 @@ func runRules(st *storage.Storage, cmd *cobra.Command, usersFn func(*users.User)
} }
s, err := st.Settings.Get() s, err := st.Settings.Get()
checkErr("st.Settings.Get", err) utils.CheckErr("st.Settings.Get", err)
if globalFn != nil { if globalFn != nil {
globalFn(s) globalFn(s)


@ -7,7 +7,9 @@ import (
"github.com/gtsteffaniak/filebrowser/rules" "github.com/gtsteffaniak/filebrowser/rules"
"github.com/gtsteffaniak/filebrowser/settings" "github.com/gtsteffaniak/filebrowser/settings"
"github.com/gtsteffaniak/filebrowser/storage"
"github.com/gtsteffaniak/filebrowser/users" "github.com/gtsteffaniak/filebrowser/users"
"github.com/gtsteffaniak/filebrowser/utils"
) )
func init() { func init() {
@ -21,7 +23,7 @@ var rulesAddCmd = &cobra.Command{
Short: "Add a global rule or user rule", Short: "Add a global rule or user rule",
Long: `Add a global rule or user rule.`, Long: `Add a global rule or user rule.`,
Args: cobra.ExactArgs(1), Args: cobra.ExactArgs(1),
Run: python(func(cmd *cobra.Command, args []string, d pythonData) { Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
allow := mustGetBool(cmd.Flags(), "allow") allow := mustGetBool(cmd.Flags(), "allow")
regex := mustGetBool(cmd.Flags(), "regex") regex := mustGetBool(cmd.Flags(), "regex")
exp := args[0] exp := args[0]
@ -43,16 +45,16 @@ var rulesAddCmd = &cobra.Command{
user := func(u *users.User) { user := func(u *users.User) {
u.Rules = append(u.Rules, rule) u.Rules = append(u.Rules, rule)
err := d.store.Users.Save(u) err := store.Users.Save(u)
checkErr("d.store.Users.Save", err) utils.CheckErr("store.Users.Save", err)
} }
global := func(s *settings.Settings) { global := func(s *settings.Settings) {
s.Rules = append(s.Rules, rule) s.Rules = append(s.Rules, rule)
err := d.store.Settings.Save(s) err := store.Settings.Save(s)
checkErr("d.store.Settings.Save", err) utils.CheckErr("store.Settings.Save", err)
} }
runRules(d.store, cmd, user, global) runRules(store, cmd, user, global)
}, pythonConfig{}), }),
} }


@ -1,6 +1,7 @@
package cmd package cmd
import ( import (
"github.com/gtsteffaniak/filebrowser/storage"
"github.com/spf13/cobra" "github.com/spf13/cobra"
) )
@ -13,7 +14,7 @@ var rulesLsCommand = &cobra.Command{
Short: "List global rules or user specific rules", Short: "List global rules or user specific rules",
Long: `List global rules or user specific rules.`, Long: `List global rules or user specific rules.`,
Args: cobra.NoArgs, Args: cobra.NoArgs,
Run: python(func(cmd *cobra.Command, args []string, d pythonData) { Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
runRules(d.store, cmd, nil, nil) runRules(store, cmd, nil, nil)
}, pythonConfig{}), }),
} }


@ -11,10 +11,6 @@ import (
"github.com/gtsteffaniak/filebrowser/users" "github.com/gtsteffaniak/filebrowser/users"
) )
func init() {
rootCmd.AddCommand(usersCmd)
}
var usersCmd = &cobra.Command{ var usersCmd = &cobra.Command{
Use: "users", Use: "users",
Short: "Users management utility", Short: "Users management utility",


@ -3,7 +3,9 @@ package cmd
import ( import (
"github.com/spf13/cobra" "github.com/spf13/cobra"
"github.com/gtsteffaniak/filebrowser/storage"
"github.com/gtsteffaniak/filebrowser/users" "github.com/gtsteffaniak/filebrowser/users"
"github.com/gtsteffaniak/filebrowser/utils"
) )
func init() { func init() {
@ -15,26 +17,26 @@ var usersAddCmd = &cobra.Command{
Short: "Create a new user", Short: "Create a new user",
Long: `Create a new user and add it to the database.`, Long: `Create a new user and add it to the database.`,
Args: cobra.ExactArgs(2), //nolint:gomnd Args: cobra.ExactArgs(2), //nolint:gomnd
Run: python(func(cmd *cobra.Command, args []string, d pythonData) { Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
user := &users.User{ user := &users.User{
Username: args[0], Username: args[0],
Password: args[1], Password: args[1],
LockPassword: mustGetBool(cmd.Flags(), "lockPassword"), LockPassword: mustGetBool(cmd.Flags(), "lockPassword"),
} }
servSettings, err := d.store.Settings.GetServer() servSettings, err := store.Settings.GetServer()
checkErr("d.store.Settings.GetServer()", err) utils.CheckErr("store.Settings.GetServer()", err)
// since getUserDefaults() polluted s.Defaults.Scope // since getUserDefaults() polluted s.Defaults.Scope
// which makes the Scope not the one saved in the db // which makes the Scope not the one saved in the db
// we need the right s.Defaults.Scope here // we need the right s.Defaults.Scope here
s2, err := d.store.Settings.Get() s2, err := store.Settings.Get()
checkErr("d.store.Settings.Get()", err) utils.CheckErr("store.Settings.Get()", err)
userHome, err := s2.MakeUserDir(user.Username, user.Scope, servSettings.Root) userHome, err := s2.MakeUserDir(user.Username, user.Scope, servSettings.Root)
checkErr("s2.MakeUserDir", err) utils.CheckErr("s2.MakeUserDir", err)
user.Scope = userHome user.Scope = userHome
err = d.store.Users.Save(user) err = store.Users.Save(user)
checkErr("d.store.Users.Save", err) utils.CheckErr("store.Users.Save", err)
printUsers([]*users.User{user}) printUsers([]*users.User{user})
}, pythonConfig{}), }),
} }


@ -1,6 +1,8 @@
package cmd package cmd
import ( import (
"github.com/gtsteffaniak/filebrowser/storage"
"github.com/gtsteffaniak/filebrowser/utils"
"github.com/spf13/cobra" "github.com/spf13/cobra"
) )
@ -14,11 +16,11 @@ var usersExportCmd = &cobra.Command{
Long: `Export all users to a json or yaml file. Please indicate the Long: `Export all users to a json or yaml file. Please indicate the
path to the file where you want to write the users.`, path to the file where you want to write the users.`,
Args: jsonYamlArg, Args: jsonYamlArg,
Run: python(func(cmd *cobra.Command, args []string, d pythonData) { Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
list, err := d.store.Users.Gets("") list, err := store.Users.Gets("")
checkErr("d.store.Users.Gets", err) utils.CheckErr("store.Users.Gets", err)
err = marshal(args[0], list) err = marshal(args[0], list)
checkErr("marshal", err) utils.CheckErr("marshal", err)
}, pythonConfig{}), }),
} }


@ -3,7 +3,9 @@ package cmd
import ( import (
"github.com/spf13/cobra" "github.com/spf13/cobra"
"github.com/gtsteffaniak/filebrowser/storage"
"github.com/gtsteffaniak/filebrowser/users" "github.com/gtsteffaniak/filebrowser/users"
"github.com/gtsteffaniak/filebrowser/utils"
) )
func init() { func init() {
@ -26,7 +28,7 @@ var usersLsCmd = &cobra.Command{
Run: findUsers, Run: findUsers,
} }
var findUsers = python(func(cmd *cobra.Command, args []string, d pythonData) { var findUsers = cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
var ( var (
list []*users.User list []*users.User
user *users.User user *users.User
@ -36,16 +38,16 @@ var findUsers = python(func(cmd *cobra.Command, args []string, d pythonData) {
if len(args) == 1 { if len(args) == 1 {
username, id := parseUsernameOrID(args[0]) username, id := parseUsernameOrID(args[0])
if username != "" { if username != "" {
user, err = d.store.Users.Get("", username) user, err = store.Users.Get("", username)
} else { } else {
user, err = d.store.Users.Get("", id) user, err = store.Users.Get("", id)
} }
list = []*users.User{user} list = []*users.User{user}
} else { } else {
list, err = d.store.Users.Gets("") list, err = store.Users.Gets("")
} }
checkErr("findUsers", err) utils.CheckErr("findUsers", err)
printUsers(list) printUsers(list)
}, pythonConfig{}) })


@@ -8,7 +9,9 @@ import (
 	"github.com/spf13/cobra"
 
+	"github.com/gtsteffaniak/filebrowser/storage"
 	"github.com/gtsteffaniak/filebrowser/users"
+	"github.com/gtsteffaniak/filebrowser/utils"
 )
 
 func init() {
@@ -25,47 +27,47 @@ file. You can use this command to import new users to your
 installation. For that, just don't place their ID on the files
 list or set it to 0.`,
 	Args: jsonYamlArg,
-	Run: python(func(cmd *cobra.Command, args []string, d pythonData) {
+	Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
 		fd, err := os.Open(args[0])
-		checkErr("os.Open", err)
+		utils.CheckErr("os.Open", err)
 		defer fd.Close()
 
 		list := []*users.User{}
 		err = unmarshal(args[0], &list)
-		checkErr("unmarshal", err)
+		utils.CheckErr("unmarshal", err)
 
 		if mustGetBool(cmd.Flags(), "replace") {
-			oldUsers, err := d.store.Users.Gets("")
-			checkErr("d.store.Users.Gets", err)
+			oldUsers, err := store.Users.Gets("")
+			utils.CheckErr("store.Users.Gets", err)
 
 			err = marshal("users.backup.json", list)
-			checkErr("marshal users.backup.json", err)
+			utils.CheckErr("marshal users.backup.json", err)
 
 			for _, user := range oldUsers {
-				err = d.store.Users.Delete(user.ID)
-				checkErr("d.store.Users.Delete", err)
+				err = store.Users.Delete(user.ID)
+				utils.CheckErr("store.Users.Delete", err)
 			}
 		}
 
 		overwrite := mustGetBool(cmd.Flags(), "overwrite")
 		for _, user := range list {
-			onDB, err := d.store.Users.Get("", user.ID)
+			onDB, err := store.Users.Get("", user.ID)
 
 			// User exists in DB.
 			if err == nil {
 				if !overwrite {
 					newErr := errors.New("user " + strconv.Itoa(int(user.ID)) + " is already registered")
-					checkErr("", newErr)
+					utils.CheckErr("", newErr)
 				}
 
 				// If the usernames mismatch, check if there is another one in the DB
 				// with the new username. If there is, print an error and cancel the
 				// operation
 				if user.Username != onDB.Username {
-					if conflictuous, err := d.store.Users.Get("", user.Username); err == nil { //nolint:govet
+					if conflictuous, err := store.Users.Get("", user.Username); err == nil { //nolint:govet
 						newErr := usernameConflictError(user.Username, conflictuous.ID, user.ID)
-						checkErr("usernameConflictError", newErr)
+						utils.CheckErr("usernameConflictError", newErr)
 					}
 				}
 			} else {
@@ -74,10 +76,10 @@ list or set it to 0.`,
 				user.ID = 0
 			}
 
-			err = d.store.Users.Save(user)
-			checkErr("d.store.Users.Save", err)
+			err = store.Users.Save(user)
+			utils.CheckErr("store.Users.Save", err)
 		}
-	}, pythonConfig{}),
+	}),
 }
 
 func usernameConflictError(username string, originalID, newID uint) error {

View File

@@ -3,6 +3,8 @@ package cmd
 import (
 	"log"
 
+	"github.com/gtsteffaniak/filebrowser/storage"
+	"github.com/gtsteffaniak/filebrowser/utils"
 	"github.com/spf13/cobra"
 )
 
@@ -15,17 +17,17 @@ var usersRmCmd = &cobra.Command{
 	Short: "Delete a user by username or id",
 	Long:  `Delete a user by username or id`,
 	Args:  cobra.ExactArgs(1),
-	Run: python(func(cmd *cobra.Command, args []string, d pythonData) {
+	Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
 		username, id := parseUsernameOrID(args[0])
 		var err error
 
 		if username != "" {
-			err = d.store.Users.Delete(username)
+			err = store.Users.Delete(username)
 		} else {
-			err = d.store.Users.Delete(id)
+			err = store.Users.Delete(id)
 		}
 
-		checkErr("usersRmCmd", err)
+		utils.CheckErr("usersRmCmd", err)
 		log.Println("user deleted successfully")
-	}, pythonConfig{}),
+	}),
 }

View File

@@ -3,7 +3,9 @@ package cmd
 import (
 	"github.com/spf13/cobra"
 
+	"github.com/gtsteffaniak/filebrowser/storage"
 	"github.com/gtsteffaniak/filebrowser/users"
+	"github.com/gtsteffaniak/filebrowser/utils"
 )
 
 func init() {
@@ -16,7 +18,7 @@ var usersUpdateCmd = &cobra.Command{
 	Long: `Updates an existing user. Set the flags for the
 options you want to change.`,
 	Args: cobra.ExactArgs(1),
-	Run: python(func(cmd *cobra.Command, args []string, d pythonData) {
+	Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
 		username, id := parseUsernameOrID(args[0])
 		var (
@@ -25,14 +27,14 @@ options you want to change.`,
 		)
 
 		if id != 0 {
-			user, err = d.store.Users.Get("", id)
+			user, err = store.Users.Get("", id)
 		} else {
-			user, err = d.store.Users.Get("", username)
+			user, err = store.Users.Get("", username)
 		}
-		checkErr("d.store.Users.Get", err)
+		utils.CheckErr("store.Users.Get", err)
 
-		err = d.store.Users.Update(user)
-		checkErr("d.store.Users.Update", err)
+		err = store.Users.Update(user)
+		utils.CheckErr("store.Users.Update", err)
 		printUsers([]*users.User{user})
-	}, pythonConfig{}),
+	}),
 }

View File

@@ -3,113 +3,42 @@ package cmd
 import (
 	"encoding/json"
 	"errors"
-	"fmt"
-	"log"
 	"os"
 	"path/filepath"
 
-	"github.com/asdine/storm/v3"
 	"github.com/goccy/go-yaml"
 	"github.com/spf13/cobra"
 	"github.com/spf13/pflag"
 
-	"github.com/gtsteffaniak/filebrowser/settings"
 	"github.com/gtsteffaniak/filebrowser/storage"
-	"github.com/gtsteffaniak/filebrowser/storage/bolt"
+	"github.com/gtsteffaniak/filebrowser/utils"
 )
 
-func checkErr(source string, err error) {
-	if err != nil {
-		log.Fatalf("%s: %v", source, err)
-	}
-}
-
 func mustGetString(flags *pflag.FlagSet, flag string) string {
 	s, err := flags.GetString(flag)
-	checkErr("mustGetString", err)
+	utils.CheckErr("mustGetString", err)
 	return s
 }
 
 func mustGetBool(flags *pflag.FlagSet, flag string) bool {
 	b, err := flags.GetBool(flag)
-	checkErr("mustGetBool", err)
+	utils.CheckErr("mustGetBool", err)
 	return b
 }
 
 func mustGetUint(flags *pflag.FlagSet, flag string) uint {
 	b, err := flags.GetUint(flag)
-	checkErr("mustGetUint", err)
+	utils.CheckErr("mustGetUint", err)
 	return b
 }
 
-func generateKey() []byte {
-	k, err := settings.GenerateKey()
-	checkErr("generateKey", err)
-	return k
-}
-
 type cobraFunc func(cmd *cobra.Command, args []string)
-type pythonFunc func(cmd *cobra.Command, args []string, data pythonData)
+type pythonFunc func(cmd *cobra.Command, args []string, store *storage.Storage)
 
-type pythonConfig struct {
-	noDB      bool
-	allowNoDB bool
-}
-
-type pythonData struct {
-	hadDB bool
-	store *storage.Storage
-}
-
-func dbExists(path string) (bool, error) {
-	stat, err := os.Stat(path)
-	if err == nil {
-		return stat.Size() != 0, nil
-	}
-
-	if os.IsNotExist(err) {
-		d := filepath.Dir(path)
-		_, err = os.Stat(d)
-		if os.IsNotExist(err) {
-			if err := os.MkdirAll(d, 0700); err != nil { //nolint:govet,gomnd
-				return false, err
-			}
-			return false, nil
-		}
-	}
-
-	return false, err
-}
-
-func python(fn pythonFunc, cfg pythonConfig) cobraFunc {
-	return func(cmd *cobra.Command, args []string) {
-		data := pythonData{hadDB: true}
-		path := settings.Config.Server.Database
-		exists, err := dbExists(path)
-		if err != nil {
-			panic(err)
-		} else if exists && cfg.noDB {
-			log.Fatal(path + " already exists")
-		} else if !exists && !cfg.noDB && !cfg.allowNoDB {
-			log.Fatal(path + " does not exist. Please run 'filebrowser config init' first.")
-		}
-		data.hadDB = exists
-		db, err := storm.Open(path)
-		checkErr(fmt.Sprintf("storm.Open path %v", path), err)
-		defer db.Close()
-
-		data.store, err = bolt.NewStorage(db)
-		checkErr("bolt.NewStorage", err)
-		fn(cmd, args, data)
-	}
-}
-
 func marshal(filename string, data interface{}) error {
 	fd, err := os.Create(filename)
-	checkErr("os.Create", err)
+	utils.CheckErr("os.Create", err)
 	defer fd.Close()
 
 	switch ext := filepath.Ext(filename); ext {
@@ -127,7 +56,7 @@ func marshal(filename string, data interface{}) error {
 func unmarshal(filename string, data interface{}) error {
 	fd, err := os.Open(filename)
-	checkErr("os.Open", err)
+	utils.CheckErr("os.Open", err)
 	defer fd.Close()
 
 	switch ext := filepath.Ext(filename); ext {
@@ -152,3 +81,8 @@ func jsonYamlArg(cmd *cobra.Command, args []string) error {
 		return errors.New("invalid format: " + ext)
 	}
 }
+
+func cobraCmd(fn pythonFunc) cobraFunc {
+	return func(cmd *cobra.Command, args []string) {
+	}
+}

View File

@@ -1,21 +0,0 @@
-package cmd
-
-import (
-	"fmt"
-
-	"github.com/spf13/cobra"
-
-	"github.com/gtsteffaniak/filebrowser/version"
-)
-
-func init() {
-	rootCmd.AddCommand(versionCmd)
-}
-
-var versionCmd = &cobra.Command{
-	Use:   "version",
-	Short: "Print the version number",
-	Run: func(cmd *cobra.Command, args []string) {
-		fmt.Println("File Browser " + version.Version + "/" + version.CommitSHA)
-	},
-}

View File

@@ -91,7 +91,7 @@ func ParseSearch(value string) *SearchOptions {
 		opts.LargerThan = updateSize(size)
 	}
 	if strings.HasPrefix(filter, "smallerThan=") {
-		opts.Conditions["larger"] = true
+		opts.Conditions["smaller"] = true
 		size := strings.TrimPrefix(filter, "smallerThan=")
 		opts.SmallerThan = updateSize(size)
 	}
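This one-word fix addresses the changelog's "smaller than actually returned larger than" bug: the `smallerThan=` filter was flipping the `"larger"` condition flag. A simplified sketch of the corrected parsing, under the assumption that each size filter must set its own flag (helper name is illustrative, not the project's API):

```go
package main

import (
	"fmt"
	"strings"
)

// parseSizeFilters sketches the fixed logic: each size filter flips
// its own condition flag ("smaller" vs "larger").
func parseSizeFilters(value string) map[string]bool {
	conditions := map[string]bool{}
	for _, filter := range strings.Fields(value) {
		if strings.HasPrefix(filter, "type:largerThan=") {
			conditions["larger"] = true
		}
		if strings.HasPrefix(filter, "type:smallerThan=") {
			// the pre-fix code mistakenly set conditions["larger"] here
			conditions["smaller"] = true
		}
	}
	return conditions
}

func main() {
	fmt.Println(parseSizeFilters("type:smallerThan=100 report")["smaller"])
}
```

With the old behavior, a `smallerThan` query would have been evaluated by the `larger` branch downstream, inverting the result set.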

View File

@@ -0,0 +1,154 @@
package files
import (
"fmt"
"testing"
"github.com/stretchr/testify/assert"
)
// Helper function to create error messages dynamically
func errorMsg(extension, expectedType string, expectedMatch bool) string {
matchStatus := "to match"
if !expectedMatch {
matchStatus = "to not match"
}
return fmt.Sprintf("Expected %s %s type '%s'", extension, matchStatus, expectedType)
}
func TestIsMatchingType(t *testing.T) {
// Test cases where IsMatchingType should return true
trueTestCases := []struct {
extension string
expectedType string
}{
{".pdf", "pdf"},
{".doc", "doc"},
{".docx", "doc"},
{".json", "text"},
{".sh", "text"},
{".zip", "archive"},
{".rar", "archive"},
}
for _, tc := range trueTestCases {
assert.True(t, IsMatchingType(tc.extension, tc.expectedType), errorMsg(tc.extension, tc.expectedType, true))
}
// Test cases where IsMatchingType should return false
falseTestCases := []struct {
extension string
expectedType string
}{
{".mp4", "doc"},
{".mp4", "text"},
{".mp4", "archive"},
}
for _, tc := range falseTestCases {
assert.False(t, IsMatchingType(tc.extension, tc.expectedType), errorMsg(tc.extension, tc.expectedType, false))
}
}
func TestUpdateSize(t *testing.T) {
// Helper function for size error messages
sizeErrorMsg := func(input string, expected, actual int) string {
return fmt.Sprintf("Expected size for input '%s' to be %d, got %d", input, expected, actual)
}
// Test cases for updateSize
testCases := []struct {
input string
expected int
}{
{"150", 150},
{"invalid", 100},
{"", 100},
}
for _, tc := range testCases {
actual := updateSize(tc.input)
assert.Equal(t, tc.expected, actual, sizeErrorMsg(tc.input, tc.expected, actual))
}
}
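The table above pins down `updateSize`'s fallback behavior: invalid or empty input yields 100. A matching sketch of an implementation that satisfies these cases (the project's actual function may differ):

```go
package main

import (
	"fmt"
	"strconv"
)

// updateSizeSketch mirrors the behavior the tests expect:
// parse the number, falling back to 100 on empty or invalid input.
func updateSizeSketch(given string) int {
	size, err := strconv.Atoi(given)
	if err != nil || size == 0 {
		return 100
	}
	return size
}

func main() {
	fmt.Println(updateSizeSketch("150"), updateSizeSketch("invalid"), updateSizeSketch(""))
}
```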
func TestIsDoc(t *testing.T) {
// Test cases where IsMatchingType should return true for document types
docTrueTestCases := []struct {
extension string
expectedType string
}{
{".doc", "doc"},
{".pdf", "doc"},
}
for _, tc := range docTrueTestCases {
assert.True(t, IsMatchingType(tc.extension, tc.expectedType), errorMsg(tc.extension, tc.expectedType, true))
}
// Test case where IsMatchingType should return false for document types
docFalseTestCases := []struct {
extension string
expectedType string
}{
{".mp4", "doc"},
}
for _, tc := range docFalseTestCases {
assert.False(t, IsMatchingType(tc.extension, tc.expectedType), errorMsg(tc.extension, tc.expectedType, false))
}
}
func TestIsText(t *testing.T) {
// Test cases where IsMatchingType should return true for text types
textTrueTestCases := []struct {
extension string
expectedType string
}{
{".json", "text"},
{".sh", "text"},
}
for _, tc := range textTrueTestCases {
assert.True(t, IsMatchingType(tc.extension, tc.expectedType), errorMsg(tc.extension, tc.expectedType, true))
}
// Test case where IsMatchingType should return false for text types
textFalseTestCases := []struct {
extension string
expectedType string
}{
{".mp4", "text"},
}
for _, tc := range textFalseTestCases {
assert.False(t, IsMatchingType(tc.extension, tc.expectedType), errorMsg(tc.extension, tc.expectedType, false))
}
}
func TestIsArchive(t *testing.T) {
// Test cases where IsMatchingType should return true for archive types
archiveTrueTestCases := []struct {
extension string
expectedType string
}{
{".zip", "archive"},
{".rar", "archive"},
}
for _, tc := range archiveTrueTestCases {
assert.True(t, IsMatchingType(tc.extension, tc.expectedType), errorMsg(tc.extension, tc.expectedType, true))
}
// Test case where IsMatchingType should return false for archive types
archiveFalseTestCases := []struct {
extension string
expectedType string
}{
{".mp4", "archive"},
}
for _, tc := range archiveFalseTestCases {
assert.False(t, IsMatchingType(tc.extension, tc.expectedType), errorMsg(tc.extension, tc.expectedType, false))
}
}
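The tests above treat `IsMatchingType` as a lookup from extension to type bucket. One plausible way to back such a check is an extension table per type; the mapping below is assumed for illustration and is not the project's actual table:

```go
package main

import "fmt"

// typeExtensions is an assumed mapping; the real IsMatchingType may use
// larger tables or MIME-based detection.
var typeExtensions = map[string]map[string]bool{
	"doc":     {".doc": true, ".docx": true, ".pdf": true},
	"text":    {".json": true, ".sh": true, ".txt": true},
	"archive": {".zip": true, ".rar": true, ".tar": true},
}

// isMatchingTypeSketch reports whether the extension belongs to the bucket.
func isMatchingTypeSketch(extension, matchType string) bool {
	exts, ok := typeExtensions[matchType]
	return ok && exts[extension]
}

func main() {
	fmt.Println(isMatchingTypeSketch(".zip", "archive"), isMatchingTypeSketch(".mp4", "doc"))
}
```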

View File

@@ -21,18 +21,26 @@ import (
 	"github.com/gtsteffaniak/filebrowser/errors"
 	"github.com/gtsteffaniak/filebrowser/rules"
 	"github.com/gtsteffaniak/filebrowser/settings"
-	"github.com/gtsteffaniak/filebrowser/users"
 )
 
 var (
-	bytesInMegabyte int64 = 1000000
 	pathMutexes   = make(map[string]*sync.Mutex)
 	pathMutexesMu sync.Mutex // Mutex to protect the pathMutexes map
 )
 
+type ReducedItem struct {
+	Name    string    `json:"name"`
+	Size    int64     `json:"size"`
+	ModTime time.Time `json:"modified"`
+	IsDir   bool      `json:"isDir,omitempty"`
+	Type    string    `json:"type"`
+}
+
 // FileInfo describes a file.
+// reduced item is non-recursive reduced "Items", used to pass flat items array
 type FileInfo struct {
-	*Listing
+	Items        []*FileInfo   `json:"-"`
+	ReducedItems []ReducedItem `json:"items,omitempty"`
 	Path         string        `json:"path,omitempty"`
 	Name         string        `json:"name"`
 	Size         int64         `json:"size"`
@@ -47,6 +55,8 @@ type FileInfo struct {
 	Content   string            `json:"content,omitempty"`
 	Checksums map[string]string `json:"checksums,omitempty"`
 	Token     string            `json:"token,omitempty"`
+	NumDirs   int               `json:"numDirs"`
+	NumFiles  int               `json:"numFiles"`
 }
 
 // FileOptions are the options when getting a file info.
@@ -61,26 +71,11 @@ type FileOptions struct {
 	Content    bool
 }
 
-// Sorting constants
-const (
-	SortingByName     = "name"
-	SortingBySize     = "size"
-	SortingByModified = "modified"
-)
-
-// Listing is a collection of files.
-type Listing struct {
-	Items    []*FileInfo   `json:"items"`
-	Path     string        `json:"path"`
-	NumDirs  int           `json:"numDirs"`
-	NumFiles int           `json:"numFiles"`
-	Sorting  users.Sorting `json:"sorting"`
-}
-
-// NewFileInfo creates a File object from a path and a given user. This File
-// object will be automatically filled depending on if it is a directory
-// or a file. If it's a video file, it will also detect any subtitles.
+// Legacy file info method, only called on non-indexed directories.
+// Once indexing completes for the first time, NewFileInfo is never called.
 func NewFileInfo(opts FileOptions) (*FileInfo, error) {
+	index := GetIndex(rootPath)
 	if !opts.Checker.Check(opts.Path) {
 		return nil, os.ErrPermission
 	}
@@ -93,6 +88,26 @@ func NewFileInfo(opts FileOptions) (*FileInfo, error) {
 		if err = file.readListing(opts.Path, opts.Checker, opts.ReadHeader); err != nil {
 			return nil, err
 		}
+		cleanedItems := []ReducedItem{}
+		for _, item := range file.Items {
+			// This is particularly useful for root of index, while indexing hasn't finished.
+			// adds the directory sizes for directories that have been indexed already.
+			if item.IsDir {
+				adjustedPath := index.makeIndexPath(opts.Path+"/"+item.Name, true)
+				info, _ := index.GetMetadataInfo(adjustedPath)
+				item.Size = info.Size
+			}
+			cleanedItems = append(cleanedItems, ReducedItem{
+				Name:    item.Name,
+				Size:    item.Size,
+				IsDir:   item.IsDir,
+				ModTime: item.ModTime,
+				Type:    item.Type,
+			})
+		}
+		file.Items = nil
+		file.ReducedItems = cleanedItems
 		return file, nil
 	}
 
 	err = file.detectType(opts.Path, opts.Modify, opts.Content, true)
@@ -102,6 +117,7 @@ func NewFileInfo(opts FileOptions) (*FileInfo, error) {
 	}
 	return file, err
 }
 
 func FileInfoFaster(opts FileOptions) (*FileInfo, error) {
 	// Lock access for the specific path
 	pathMutex := getMutex(opts.Path)
@@ -133,12 +149,11 @@ func FileInfoFaster(opts FileOptions) (*FileInfo, error) {
 		file, err := NewFileInfo(opts)
 		return file, err
 	}
-	info, exists := index.GetMetadataInfo(adjustedPath)
+	info, exists := index.GetMetadataInfo(adjustedPath + "/" + filepath.Base(opts.Path))
 	if !exists || info.Name == "" {
-		return &FileInfo{}, errors.ErrEmptyKey
+		return NewFileInfo(opts)
 	}
-
 	return &info, nil
 }
 
 func RefreshFileInfo(opts FileOptions) error {
@@ -491,9 +506,8 @@ func (i *FileInfo) readListing(path string, checker rules.Checker, readHeader bo
 		return err
 	}
 
-	listing := &Listing{
+	listing := &FileInfo{
 		Items:    []*FileInfo{},
-		Path:     i.Path,
 		NumDirs:  0,
 		NumFiles: 0,
 	}
@@ -548,7 +562,7 @@ func (i *FileInfo) readListing(path string, checker rules.Checker, readHeader bo
 		listing.Items = append(listing.Items, file)
 	}
 
-	i.Listing = listing
+	i.Items = listing.Items
 	return nil
 }

View File

@@ -1,7 +1,6 @@
 package files
 
 import (
-	"bytes"
 	"log"
 	"os"
 	"path/filepath"
@@ -12,23 +11,12 @@ import (
 	"github.com/gtsteffaniak/filebrowser/settings"
 )
 
-type Directory struct {
-	Metadata map[string]FileInfo
-	Files    string
-}
-
-type File struct {
-	Name  string
-	IsDir bool
-}
-
 type Index struct {
 	Root        string
-	Directories map[string]Directory
+	Directories map[string]FileInfo
 	NumDirs     int
 	NumFiles    int
 	inProgress  bool
-	quickList   []File
 	LastIndexed time.Time
 	mu          sync.RWMutex
 }
@@ -50,16 +38,12 @@ func indexingScheduler(intervalMinutes uint32) {
 		rootPath = settings.Config.Server.Root
 	}
 	si := GetIndex(rootPath)
-	log.Printf("Indexing Files...")
-	log.Printf("Configured to run every %v minutes", intervalMinutes)
-	log.Printf("Indexing from root: %s", si.Root)
 	for {
 		startTime := time.Now()
 		// Set the indexing flag to indicate that indexing is in progress
 		si.resetCount()
 		// Perform the indexing operation
 		err := si.indexFiles(si.Root)
-		si.quickList = []File{}
 		// Reset the indexing flag to indicate that indexing has finished
 		si.inProgress = false
 		// Update the LastIndexed time
@@ -81,78 +65,114 @@ func indexingScheduler(intervalMinutes uint32) {
 // Define a function to recursively index files and directories
 func (si *Index) indexFiles(path string) error {
-	// Check if the current directory has been modified since the last indexing
+	// Ensure path is cleaned and normalized
 	adjustedPath := si.makeIndexPath(path, true)
 
-	// Open the directory
 	dir, err := os.Open(path)
 	if err != nil {
-		// Directory must have been deleted, remove it from the index
+		// If the directory can't be opened (e.g., deleted), remove it from the index
 		si.RemoveDirectory(adjustedPath)
+		return err
 	}
+	defer dir.Close()
 
 	dirInfo, err := dir.Stat()
 	if err != nil {
-		dir.Close()
 		return err
 	}
 
-	// Compare the last modified time of the directory with the last indexed time
-	lastIndexed := si.LastIndexed
-	if dirInfo.ModTime().Before(lastIndexed) {
-		dir.Close()
+	// Check if the directory is already up-to-date
+	if dirInfo.ModTime().Before(si.LastIndexed) {
 		return nil
 	}
 
-	// Read the directory contents
+	// Read directory contents
 	files, err := dir.Readdir(-1)
 	if err != nil {
 		return err
 	}
-	dir.Close()
-	si.UpdateQuickList(files)
-	si.InsertFiles(path)
-	// done separately for memory efficiency on recursion
-	si.InsertDirs(path)
+
+	// Recursively process files and directories
+	fileInfos := []*FileInfo{}
+	var totalSize int64
+	var numDirs, numFiles int
+
+	for _, file := range files {
+		parentInfo := &FileInfo{
+			Name:    file.Name(),
+			Size:    file.Size(),
+			ModTime: file.ModTime(),
+			IsDir:   file.IsDir(),
+		}
+		childInfo, err := si.InsertInfo(path, parentInfo)
+		if err != nil {
+			// Log error, but continue processing other files
+			continue
+		}
+
+		// Accumulate directory size and items
+		totalSize += childInfo.Size
+		if childInfo.IsDir {
+			numDirs++
+		} else {
+			numFiles++
+		}
+		_ = childInfo.detectType(path, true, false, false)
+		fileInfos = append(fileInfos, childInfo)
+	}
+
+	// Create FileInfo for the current directory
+	dirFileInfo := &FileInfo{
+		Items:     fileInfos,
+		Name:      filepath.Base(path),
+		Size:      totalSize,
+		ModTime:   dirInfo.ModTime(),
+		CacheTime: time.Now(),
+		IsDir:     true,
+		NumDirs:   numDirs,
+		NumFiles:  numFiles,
+	}
+
+	// Add directory to index
+	si.mu.Lock()
+	si.Directories[adjustedPath] = *dirFileInfo
+	si.NumDirs += numDirs
+	si.NumFiles += numFiles
+	si.mu.Unlock()
 
 	return nil
 }
 
-func (si *Index) InsertFiles(path string) {
-	adjustedPath := si.makeIndexPath(path, true)
-	subDirectory := Directory{}
-	buffer := bytes.Buffer{}
-
-	for _, f := range si.GetQuickList() {
-		if !f.IsDir {
-			buffer.WriteString(f.Name + ";")
-			si.UpdateCount("files")
-		}
-	}
-	// Use GetMetadataInfo and SetFileMetadata for safer read and write operations
-	subDirectory.Files = buffer.String()
-	si.SetDirectoryInfo(adjustedPath, subDirectory)
-}
-
-func (si *Index) InsertDirs(path string) {
-	for _, f := range si.GetQuickList() {
-		if f.IsDir {
-			adjustedPath := si.makeIndexPath(path, true)
-			if _, exists := si.Directories[adjustedPath]; exists {
-				si.UpdateCount("dirs")
-				// Add or update the directory in the map
-				if adjustedPath == "/" {
-					si.SetDirectoryInfo("/"+f.Name, Directory{})
-				} else {
-					si.SetDirectoryInfo(adjustedPath+"/"+f.Name, Directory{})
-				}
-			}
-			err := si.indexFiles(path + "/" + f.Name)
-			if err != nil {
-				if err.Error() == "invalid argument" {
-					log.Printf("Could not index \"%v\": %v \n", path, "Permission Denied")
-				} else {
-					log.Printf("Could not index \"%v\": %v \n", path, err)
-				}
-			}
-		}
-	}
-}
+// InsertInfo function to handle adding a file or directory into the index
+func (si *Index) InsertInfo(parentPath string, file *FileInfo) (*FileInfo, error) {
+	filePath := filepath.Join(parentPath, file.Name)
+
+	// Check if it's a directory and recursively index it
+	if file.IsDir {
+		// Recursively index directory
+		err := si.indexFiles(filePath)
+		if err != nil {
+			return nil, err
+		}
+
+		// Return directory info from the index
+		adjustedPath := si.makeIndexPath(filePath, true)
+		si.mu.RLock()
+		dirInfo := si.Directories[adjustedPath]
+		si.mu.RUnlock()
+		return &dirInfo, nil
+	}
+
+	// Create FileInfo for regular files
+	fileInfo := &FileInfo{
+		Path:    filePath,
+		Name:    file.Name,
+		Size:    file.Size,
+		ModTime: file.ModTime,
+		IsDir:   false,
+	}
+	return fileInfo, nil
+}
 
 func (si *Index) makeIndexPath(subPath string, isDir bool) string {
@@ -171,5 +191,8 @@ func (si *Index) makeIndexPath(subPath string, isDir bool) string {
 	} else if !isDir {
 		adjustedPath = filepath.Dir(adjustedPath)
 	}
+	if !strings.HasPrefix(adjustedPath, "/") {
+		adjustedPath = "/" + adjustedPath
+	}
 	return adjustedPath
 }
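The new leading-slash guard matters because the index map is keyed by these adjusted paths, so every key needs a consistent shape. A standalone approximation of the normalization (`makeIndexPathSketch` is illustrative; the real method also handles the index root and trailing slashes):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// makeIndexPathSketch approximates the normalization: strip the root
// prefix, reduce files to their parent directory, and guarantee a
// leading slash so every index key has a consistent shape.
func makeIndexPathSketch(root, subPath string, isDir bool) string {
	adjusted := strings.TrimPrefix(filepath.ToSlash(subPath), root)
	if !isDir {
		adjusted = filepath.Dir(adjusted)
	}
	if !strings.HasPrefix(adjusted, "/") {
		adjusted = "/" + adjusted
	}
	return adjusted
}

func main() {
	// A file path and a bare relative directory both normalize to "/docs".
	fmt.Println(makeIndexPathSketch("/srv", "/srv/docs/readme.md", false))
	fmt.Println(makeIndexPathSketch("/srv", "docs", true))
}
```

Without the final guard, relative inputs like `docs` would produce a key that never matches lookups built from absolute paths.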

View File

@@ -2,6 +2,7 @@ package files
 
 import (
 	"encoding/json"
+	"fmt"
 	"math/rand"
 	"reflect"
 	"testing"
@@ -23,18 +24,26 @@ func BenchmarkFillIndex(b *testing.B) {
 func (si *Index) createMockData(numDirs, numFilesPerDir int) {
 	for i := 0; i < numDirs; i++ {
 		dirName := generateRandomPath(rand.Intn(3) + 1)
-		files := []File{}
+		files := []*FileInfo{} // Slice of FileInfo
 
-		// Append a new Directory to the slice
+		// Simulating files and directories with FileInfo
 		for j := 0; j < numFilesPerDir; j++ {
-			newFile := File{
-				Name:  "file-" + getRandomTerm() + getRandomExtension(),
-				IsDir: false,
+			newFile := &FileInfo{
+				Name:    "file-" + getRandomTerm() + getRandomExtension(),
+				IsDir:   false,
+				Size:    rand.Int63n(1000),                                           // Random size
+				ModTime: time.Now().Add(-time.Duration(rand.Intn(100)) * time.Hour), // Random mod time
 			}
 			files = append(files, newFile)
 		}
-		si.UpdateQuickListForTests(files)
-		si.InsertFiles(dirName)
-		si.InsertDirs(dirName)
+
+		// Simulate inserting files into index
+		for _, file := range files {
+			_, err := si.InsertInfo(dirName, file)
+			if err != nil {
+				fmt.Println("Error inserting file:", err)
+			}
+		}
 	}
 }

View File

@@ -2,7 +2,6 @@ package files
 
 import (
 	"math/rand"
-	"os"
 	"path/filepath"
 	"sort"
 	"strings"
@@ -30,12 +29,17 @@ func (si *Index) Search(search string, scope string, sourceSession string) ([]st
 			continue
 		}
 		si.mu.Lock()
-		defer si.mu.Unlock()
 		for dirName, dir := range si.Directories {
 			isDir := true
-			files := strings.Split(dir.Files, ";")
+			files := []string{}
+			for _, item := range dir.Items {
+				if !item.IsDir {
+					files = append(files, item.Name)
+				}
+			}
 			value, found := sessionInProgress.Load(sourceSession)
 			if !found || value != runningHash {
+				si.mu.Unlock()
 				return []string{}, map[string]map[string]bool{}
 			}
 			if count > maxSearchResults {
@@ -46,7 +50,9 @@ func (si *Index) Search(search string, scope string, sourceSession string) ([]st
 				continue // path not matched
 			}
 			fileTypes := map[string]bool{}
-			matches, fileType := containsSearchTerm(dirName, searchTerm, *searchOptions, isDir, fileTypes)
+			si.mu.Unlock()
+			matches, fileType := si.containsSearchTerm(dirName, searchTerm, *searchOptions, isDir, fileTypes)
+			si.mu.Lock()
 			if matches {
 				fileListTypes[pathName] = fileType
 				matching = append(matching, pathName)
@@ -67,8 +73,9 @@ func (si *Index) Search(search string, scope string, sourceSession string) ([]st
 				}
 				fullName := strings.TrimLeft(pathName+file, "/")
 				fileTypes := map[string]bool{}
-				matches, fileType := containsSearchTerm(fullName, searchTerm, *searchOptions, isDir, fileTypes)
+				si.mu.Unlock()
+				matches, fileType := si.containsSearchTerm(fullName, searchTerm, *searchOptions, isDir, fileTypes)
+				si.mu.Lock()
 				if !matches {
 					continue
 				}
@@ -77,6 +84,7 @@ func (si *Index) Search(search string, scope string, sourceSession string) ([]st
 				count++
 			}
 		}
+		si.mu.Unlock()
 	}
 	// Sort the strings based on the number of elements after splitting by "/"
 	sort.Slice(matching, func(i, j int) bool {
@@ -102,65 +110,88 @@ func scopedPathNameFilter(pathName string, scope string, isDir bool) string {
 	return pathName
 }
 
-func containsSearchTerm(pathName string, searchTerm string, options SearchOptions, isDir bool, fileTypes map[string]bool) (bool, map[string]bool) {
+func (si *Index) containsSearchTerm(pathName string, searchTerm string, options SearchOptions, isDir bool, fileTypes map[string]bool) (bool, map[string]bool) {
+	largerThan := int64(options.LargerThan) * 1024 * 1024
+	smallerThan := int64(options.SmallerThan) * 1024 * 1024
 	conditions := options.Conditions
-	path := getLastPathComponent(pathName)
-	// Convert to lowercase once
+	fileName := filepath.Base(pathName)
+	adjustedPath := si.makeIndexPath(pathName, isDir)
+	// Convert to lowercase if not exact match
 	if !conditions["exact"] {
-		path = strings.ToLower(path)
+		fileName = strings.ToLower(fileName)
 		searchTerm = strings.ToLower(searchTerm)
 	}
-	if strings.Contains(path, searchTerm) {
-		// Calculate fileSize only if needed
+
+	// Check if the file name contains the search term
+	if !strings.Contains(fileName, searchTerm) {
+		// Clear variables and return
+		return false, map[string]bool{}
+	}
+
+	// Initialize file size and fileTypes map
 	var fileSize int64
-	matchesAllConditions := true
-	extension := filepath.Ext(path)
+	extension := filepath.Ext(fileName)
+
+	// Collect file types
 	for _, k := range AllFiletypeOptions {
 		if IsMatchingType(extension, k) {
 			fileTypes[k] = true
 		}
 	}
 	fileTypes["dir"] = isDir
+
+	// Get file info if needed for size-related conditions
+	if largerThan > 0 || smallerThan > 0 {
+		fileInfo, exists := si.GetMetadataInfo(adjustedPath)
+		if !exists {
+			return false, fileTypes
+		} else if !isDir {
+			// Look for specific file in ReducedItems
+			for _, item := range fileInfo.ReducedItems {
+				lower := strings.ToLower(item.Name)
+				if strings.Contains(lower, searchTerm) {
+					if item.Size == 0 {
+						return false, fileTypes
+					}
+					fileSize = item.Size
+					break
+				}
+			}
+		} else {
+			fileSize = fileInfo.Size
+		}
+		if fileSize == 0 {
+			return false, fileTypes
+		}
+	}
+
+	// Evaluate all conditions
 	for t, v := range conditions {
 		if t == "exact" {
 			continue
 		}
-		var matchesCondition bool
 		switch t {
 		case "larger":
-			if fileSize == 0 {
-				fileSize = getFileSize(pathName)
+			if largerThan > 0 {
+				if fileSize <= largerThan {
+					return false, fileTypes
+				}
 			}
-			matchesCondition = fileSize > int64(options.LargerThan)*bytesInMegabyte
 		case "smaller":
-			if fileSize == 0 {
-				fileSize = getFileSize(pathName)
+			if smallerThan > 0 {
+				if fileSize >= smallerThan {
+					return false, fileTypes
+				}
 			}
-			matchesCondition = fileSize < int64(options.SmallerThan)*bytesInMegabyte
 		default:
-			matchesCondition = v == fileTypes[t]
-		}
-		if !matchesCondition {
-			matchesAllConditions = false
+			// Handle other file type conditions
+			notMatchType := v != fileTypes[t]
+			if notMatchType {
+				return false, fileTypes
+			}
 		}
 	}
-	return matchesAllConditions, fileTypes
-}
 
-func getFileSize(filepath string) int64 {
-	fileInfo, err := os.Stat(rootPath + "/" + filepath)
-	if err != nil {
-		return 0
-	}
-	return fileInfo.Size()
-}
+	return true, fileTypes
+}
 
-func getLastPathComponent(path string) string {
-	// Use filepath.Base to extract the last component of the path
-	return filepath.Base(path)
-}
 
 func generateRandomHash(length int) string {

View File

@ -11,7 +11,7 @@ func BenchmarkSearchAllIndexes(b *testing.B) {
 	InitializeIndex(5, false)
 	si := GetIndex(rootPath)
-	si.createMockData(50, 3) // 1000 dirs, 3 files per dir
+	si.createMockData(50, 3) // 50 dirs, 3 files per dir

 	// Generate 100 random search terms
 	searchTerms := generateRandomSearchTerms(100)
@ -26,87 +26,90 @@ func BenchmarkSearchAllIndexes(b *testing.B) {
 	}
 }
+// loop over test files and compare output
 func TestParseSearch(t *testing.T) {
-	value := ParseSearch("my test search")
-	want := &SearchOptions{
-		Conditions: map[string]bool{
-			"exact": false,
-		},
-		Terms: []string{"my test search"},
-	}
-	if !reflect.DeepEqual(value, want) {
-		t.Fatalf("\n got: %+v\n want: %+v", value, want)
-	}
-	value = ParseSearch("case:exact my|test|search")
-	want = &SearchOptions{
-		Conditions: map[string]bool{
-			"exact": true,
-		},
-		Terms: []string{"my", "test", "search"},
-	}
-	if !reflect.DeepEqual(value, want) {
-		t.Fatalf("\n got: %+v\n want: %+v", value, want)
-	}
-	value = ParseSearch("type:largerThan=100 type:smallerThan=1000 test")
-	want = &SearchOptions{
-		Conditions: map[string]bool{
-			"exact":  false,
-			"larger": true,
-		},
-		Terms:       []string{"test"},
-		LargerThan:  100,
-		SmallerThan: 1000,
-	}
-	if !reflect.DeepEqual(value, want) {
-		t.Fatalf("\n got: %+v\n want: %+v", value, want)
-	}
-	value = ParseSearch("type:audio thisfile")
-	want = &SearchOptions{
-		Conditions: map[string]bool{
-			"exact": false,
-			"audio": true,
-		},
-		Terms: []string{"thisfile"},
-	}
-	if !reflect.DeepEqual(value, want) {
-		t.Fatalf("\n got: %+v\n want: %+v", value, want)
-	}
-}
+	tests := []struct {
+		input string
+		want  *SearchOptions
+	}{
+		{
+			input: "my test search",
+			want: &SearchOptions{
+				Conditions: map[string]bool{"exact": false},
+				Terms:      []string{"my test search"},
+			},
+		},
+		{
+			input: "case:exact my|test|search",
+			want: &SearchOptions{
+				Conditions: map[string]bool{"exact": true},
+				Terms:      []string{"my", "test", "search"},
+			},
+		},
+		{
+			input: "type:largerThan=100 type:smallerThan=1000 test",
+			want: &SearchOptions{
+				Conditions:  map[string]bool{"exact": false, "larger": true, "smaller": true},
+				Terms:       []string{"test"},
+				LargerThan:  100,
+				SmallerThan: 1000,
+			},
+		},
+		{
+			input: "type:audio thisfile",
+			want: &SearchOptions{
+				Conditions: map[string]bool{"exact": false, "audio": true},
+				Terms:      []string{"thisfile"},
+			},
+		},
+	}
+	for _, tt := range tests {
+		t.Run(tt.input, func(t *testing.T) {
+			value := ParseSearch(tt.input)
+			if !reflect.DeepEqual(value, tt.want) {
+				t.Fatalf("\n got: %+v\n want: %+v", value, tt.want)
+			}
+		})
+	}
+}
 func TestSearchWhileIndexing(t *testing.T) {
 	InitializeIndex(5, false)
 	si := GetIndex(rootPath)
-
 	// Generate 100 random search terms
 	searchTerms := generateRandomSearchTerms(10)
 	for i := 0; i < 5; i++ {
-		// Execute the SearchAllIndexes function
-		go si.createMockData(100, 100) // 1000 dirs, 3 files per dir
+		go si.createMockData(100, 100) // Creating mock data concurrently
 		for _, term := range searchTerms {
-			go si.Search(term, "/", "test")
+			go si.Search(term, "/", "test") // Search concurrently
 		}
 	}
 }
 func TestSearchIndexes(t *testing.T) {
 	index := Index{
-		Directories: map[string]Directory{
-			"test": {
-				Files: "audio1.wav;",
-			},
-			"test/path": {
-				Files: "file.txt;",
-			},
-			"new": {},
-			"new/test": {
-				Files: "audio.wav;video.mp4;video.MP4;",
-			},
-			"new/test/path": {
-				Files: "archive.zip;",
-			},
+		Directories: map[string]FileInfo{
+			"test":      {Items: []*FileInfo{{Name: "audio1.wav"}}},
+			"test/path": {Items: []*FileInfo{{Name: "file.txt"}}},
+			"new/test": {Items: []*FileInfo{
+				{Name: "audio.wav"},
+				{Name: "video.mp4"},
+				{Name: "video.MP4"},
+			}},
+			"new/test/path": {Items: []*FileInfo{{Name: "archive.zip"}}},
+			"/firstDir": {Items: []*FileInfo{
+				{Name: "archive.zip", Size: 100},
+				{Name: "thisIsDir", IsDir: true, Size: 2 * 1024 * 1024},
+			}},
+			"/firstDir/thisIsDir": {
+				Items: []*FileInfo{
+					{Name: "hi.txt"},
+				},
+				Size: 2 * 1024 * 1024,
+			},
 		},
 	}
 	tests := []struct {
 		search         string
 		scope          string
@ -118,7 +121,7 @@ func TestSearchIndexes(t *testing.T) {
 			scope:          "/new/",
 			expectedResult: []string{"test/audio.wav"},
 			expectedTypes: map[string]map[string]bool{
-				"test/audio.wav": map[string]bool{"audio": true, "dir": false},
+				"test/audio.wav": {"audio": true, "dir": false},
 			},
 		},
 		{
@ -126,16 +129,41 @@ func TestSearchIndexes(t *testing.T) {
 			scope:          "/",
 			expectedResult: []string{"test/", "new/test/"},
 			expectedTypes: map[string]map[string]bool{
-				"test/":     map[string]bool{"dir": true},
-				"new/test/": map[string]bool{"dir": true},
+				"test/":     {"dir": true},
+				"new/test/": {"dir": true},
 			},
 		},
 		{
 			search:         "archive",
 			scope:          "/",
-			expectedResult: []string{"new/test/path/archive.zip"},
+			expectedResult: []string{"firstDir/archive.zip", "new/test/path/archive.zip"},
 			expectedTypes: map[string]map[string]bool{
-				"new/test/path/archive.zip": map[string]bool{"archive": true, "dir": false},
+				"new/test/path/archive.zip": {"archive": true, "dir": false},
+				"firstDir/archive.zip":      {"archive": true, "dir": false},
+			},
+		},
+		{
+			search:         "arch",
+			scope:          "/firstDir",
+			expectedResult: []string{"archive.zip"},
+			expectedTypes: map[string]map[string]bool{
+				"archive.zip": {"archive": true, "dir": false},
+			},
+		},
+		{
+			search:         "isdir",
+			scope:          "/",
+			expectedResult: []string{"firstDir/thisIsDir/"},
+			expectedTypes: map[string]map[string]bool{
+				"firstDir/thisIsDir/": {"dir": true},
+			},
+		},
+		{
+			search:         "dir type:largerThan=1",
+			scope:          "/",
+			expectedResult: []string{"firstDir/thisIsDir/"},
+			expectedTypes: map[string]map[string]bool{
+				"firstDir/thisIsDir/": {"dir": true},
 			},
 		},
 		{
@ -146,18 +174,17 @@ func TestSearchIndexes(t *testing.T) {
 				"new/test/video.MP4",
 			},
 			expectedTypes: map[string]map[string]bool{
-				"new/test/video.MP4": map[string]bool{"video": true, "dir": false},
-				"new/test/video.mp4": map[string]bool{"video": true, "dir": false},
+				"new/test/video.MP4": {"video": true, "dir": false},
+				"new/test/video.mp4": {"video": true, "dir": false},
 			},
 		},
 	}
 	for _, tt := range tests {
 		t.Run(tt.search, func(t *testing.T) {
 			actualResult, actualTypes := index.Search(tt.search, tt.scope, "")
 			assert.Equal(t, tt.expectedResult, actualResult)
-			if !reflect.DeepEqual(tt.expectedTypes, actualTypes) {
-				t.Fatalf("\n got: %+v\n want: %+v", actualTypes, tt.expectedTypes)
-			}
+			assert.Equal(t, tt.expectedTypes, actualTypes)
 		})
 	}
 }
@ -186,6 +213,7 @@ func Test_scopedPathNameFilter(t *testing.T) {
 			want: "", // Update this with the expected result
 		},
 	}
+
 	for _, tt := range tests {
 		t.Run(tt.name, func(t *testing.T) {
 			if got := scopedPathNameFilter(tt.args.pathName, tt.args.scope, tt.args.isDir); got != tt.want {
@ -194,103 +222,3 @@ func Test_scopedPathNameFilter(t *testing.T) {
 		})
 	}
 }
func Test_isDoc(t *testing.T) {
type args struct {
extension string
}
tests := []struct {
name string
args args
want bool
}{
// TODO: Add test cases.
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
if got := isDoc(tt.args.extension); got != tt.want {
t.Errorf("isDoc() = %v, want %v", got, tt.want)
}
})
}
}
func Test_getFileSize(t *testing.T) {
type args struct {
filepath string
}
tests := []struct {
name string
args args
want int64
}{
// TODO: Add test cases.
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
if got := getFileSize(tt.args.filepath); got != tt.want {
t.Errorf("getFileSize() = %v, want %v", got, tt.want)
}
})
}
}
func Test_isArchive(t *testing.T) {
type args struct {
extension string
}
tests := []struct {
name string
args args
want bool
}{
// TODO: Add test cases.
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
if got := isArchive(tt.args.extension); got != tt.want {
t.Errorf("isArchive() = %v, want %v", got, tt.want)
}
})
}
}
func Test_getLastPathComponent(t *testing.T) {
type args struct {
path string
}
tests := []struct {
name string
args args
want string
}{
// TODO: Add test cases.
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
if got := getLastPathComponent(tt.args.path); got != tt.want {
t.Errorf("getLastPathComponent() = %v, want %v", got, tt.want)
}
})
}
}
func Test_generateRandomHash(t *testing.T) {
type args struct {
length int
}
tests := []struct {
name string
args args
want string
}{
// TODO: Add test cases.
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
if got := generateRandomHash(tt.args.length); got != tt.want {
t.Errorf("generateRandomHash() = %v, want %v", got, tt.want)
}
})
}
}

View File

@ -1,7 +1,6 @@
 package files

 import (
-	"io/fs"
 	"log"
 	"time"
@ -13,15 +12,10 @@ func (si *Index) UpdateFileMetadata(adjustedPath string, info FileInfo) bool {
 	si.mu.Lock()
 	defer si.mu.Unlock()
 	dir, exists := si.Directories[adjustedPath]
-	if !exists || exists && dir.Metadata == nil {
-		// Initialize the Metadata map if it is nil
-		if dir.Metadata == nil {
-			dir.Metadata = make(map[string]FileInfo)
-		}
-		si.Directories[adjustedPath] = dir
-		// Release the read lock before calling SetFileMetadata
-	}
-	return si.SetFileMetadata(adjustedPath, info)
+	if !exists {
+		si.Directories[adjustedPath] = FileInfo{}
+	}
+	return si.SetFileMetadata(adjustedPath, dir)
 }
 // SetFileMetadata sets the FileInfo for the specified directory in the index.
@ -32,37 +26,45 @@ func (si *Index) SetFileMetadata(adjustedPath string, info FileInfo) bool {
 		return false
 	}
 	info.CacheTime = time.Now()
-	si.Directories[adjustedPath].Metadata[adjustedPath] = info
+	si.Directories[adjustedPath] = info
 	return true
 }

 // GetMetadataInfo retrieves the FileInfo from the specified directory in the index.
 func (si *Index) GetMetadataInfo(adjustedPath string) (FileInfo, bool) {
-	fi := FileInfo{}
 	si.mu.RLock()
 	dir, exists := si.Directories[adjustedPath]
 	si.mu.RUnlock()
-	if exists {
-		// Initialize the Metadata map if it is nil
-		if dir.Metadata == nil {
-			dir.Metadata = make(map[string]FileInfo)
-			si.SetDirectoryInfo(adjustedPath, dir)
-		} else {
-			fi = dir.Metadata[adjustedPath]
-		}
-	}
-	return fi, exists
+	if !exists {
+		return dir, exists
+	}
+	// remove recursive items, we only want this directories direct files
+	cleanedItems := []ReducedItem{}
+	for _, item := range dir.Items {
+		cleanedItems = append(cleanedItems, ReducedItem{
+			Name:    item.Name,
+			Size:    item.Size,
+			IsDir:   item.IsDir,
+			ModTime: item.ModTime,
+			Type:    item.Type,
+		})
+	}
+	dir.Items = nil
+	dir.ReducedItems = cleanedItems
+	realPath, _, _ := GetRealPath(adjustedPath)
+	dir.Path = realPath
+	return dir, exists
 }
 // SetDirectoryInfo sets the directory information in the index.
-func (si *Index) SetDirectoryInfo(adjustedPath string, dir Directory) {
+func (si *Index) SetDirectoryInfo(adjustedPath string, dir FileInfo) {
 	si.mu.Lock()
 	si.Directories[adjustedPath] = dir
 	si.mu.Unlock()
 }

 // SetDirectoryInfo sets the directory information in the index.
-func (si *Index) GetDirectoryInfo(adjustedPath string) (Directory, bool) {
+func (si *Index) GetDirectoryInfo(adjustedPath string) (FileInfo, bool) {
 	si.mu.RLock()
 	dir, exists := si.Directories[adjustedPath]
 	si.mu.RUnlock()
@ -106,7 +108,7 @@ func GetIndex(root string) *Index {
 	}
 	newIndex := &Index{
 		Root:        rootPath,
-		Directories: make(map[string]Directory), // Initialize the map
+		Directories: map[string]FileInfo{},
 		NumDirs:     0,
 		NumFiles:    0,
 		inProgress:  false,
@ -116,36 +118,3 @@ func GetIndex(root string) *Index {
 	indexesMutex.Unlock()
 	return newIndex
 }
func (si *Index) UpdateQuickList(files []fs.FileInfo) {
si.mu.Lock()
defer si.mu.Unlock()
si.quickList = []File{}
for _, file := range files {
newFile := File{
Name: file.Name(),
IsDir: file.IsDir(),
}
si.quickList = append(si.quickList, newFile)
}
}
func (si *Index) UpdateQuickListForTests(files []File) {
si.mu.Lock()
defer si.mu.Unlock()
si.quickList = []File{}
for _, file := range files {
newFile := File{
Name: file.Name,
IsDir: file.IsDir,
}
si.quickList = append(si.quickList, newFile)
}
}
func (si *Index) GetQuickList() []File {
si.mu.Lock()
defer si.mu.Unlock()
newQuickList := si.quickList
return newQuickList
}

View File

@ -1,92 +1,118 @@
 package files

 import (
-	"io/fs"
-	"os"
 	"testing"
-	"time"
+
+	"github.com/stretchr/testify/assert"
 )
-// Mock for fs.FileInfo
-type mockFileInfo struct {
-	name  string
-	isDir bool
-}
-
-func (m mockFileInfo) Name() string       { return m.name }
-func (m mockFileInfo) Size() int64        { return 0 }
-func (m mockFileInfo) Mode() os.FileMode  { return 0 }
-func (m mockFileInfo) ModTime() time.Time { return time.Now() }
-func (m mockFileInfo) IsDir() bool        { return m.isDir }
-func (m mockFileInfo) Sys() interface{}   { return nil }
-
 var testIndex Index
-// Test for GetFileMetadata
-//func TestGetFileMetadata(t *testing.T) {
-//	t.Parallel()
-//	tests := []struct {
-//		name           string
-//		adjustedPath   string
-//		fileName       string
-//		expectedName   string
-//		expectedExists bool
-//	}{
-//		{
-//			name:           "testpath exists",
-//			adjustedPath:   "/testpath",
-//			fileName:       "testfile.txt",
-//			expectedName:   "testfile.txt",
-//			expectedExists: true,
-//		},
-//		{
-//			name:           "testpath not exists",
-//			adjustedPath:   "/testpath",
-//			fileName:       "nonexistent.txt",
-//			expectedName:   "",
-//			expectedExists: false,
-//		},
-//		{
-//			name:           "File exists in /anotherpath",
-//			adjustedPath:   "/anotherpath",
-//			fileName:       "afile.txt",
-//			expectedName:   "afile.txt",
-//			expectedExists: true,
-//		},
-//		{
-//			name:           "File does not exist in /anotherpath",
-//			adjustedPath:   "/anotherpath",
-//			fileName:       "nonexistentfile.txt",
-//			expectedName:   "",
-//			expectedExists: false,
-//		},
-//		{
-//			name:           "Directory does not exist",
-//			adjustedPath:   "/nonexistentpath",
-//			fileName:       "testfile.txt",
-//			expectedName:   "",
-//			expectedExists: false,
-//		},
-//	}
-//
-//	for _, tt := range tests {
-//		t.Run(tt.name, func(t *testing.T) {
-//			fileInfo, exists := testIndex.GetFileMetadata(tt.adjustedPath)
-//			if exists != tt.expectedExists || fileInfo.Name != tt.expectedName {
-//				t.Errorf("expected %v:%v but got: %v:%v", tt.expectedName, tt.expectedExists, fileInfo.Name, exists)
-//			}
-//		})
-//	}
-//}
+// Test for GetFileMetadata
+func TestGetFileMetadataSize(t *testing.T) {
+	t.Parallel()
+	tests := []struct {
+		name         string
+		adjustedPath string
+		expectedName string
+		expectedSize int64
+	}{
+		{
+			name:         "testpath exists",
+			adjustedPath: "/testpath",
+			expectedName: "testfile.txt",
+			expectedSize: 100,
+		},
+		{
+			name:         "testpath exists",
+			adjustedPath: "/testpath",
+			expectedName: "directory",
+			expectedSize: 100,
+		},
+	}
+	for _, tt := range tests {
+		t.Run(tt.name, func(t *testing.T) {
+			fileInfo, _ := testIndex.GetMetadataInfo(tt.adjustedPath)
+			// Iterate over fileInfo.Items to look for expectedName
+			for _, item := range fileInfo.ReducedItems {
+				// Assert the existence and the name
+				if item.Name == tt.expectedName {
+					assert.Equal(t, tt.expectedSize, item.Size)
+					break
+				}
+			}
+		})
+	}
+}
+
+// Test for GetFileMetadata
+func TestGetFileMetadata(t *testing.T) {
+	t.Parallel()
+	tests := []struct {
+		name           string
+		adjustedPath   string
+		expectedName   string
+		expectedExists bool
+	}{
+		{
+			name:           "testpath exists",
+			adjustedPath:   "/testpath",
+			expectedName:   "testfile.txt",
+			expectedExists: true,
+		},
+		{
+			name:           "testpath not exists",
+			adjustedPath:   "/testpath",
+			expectedName:   "nonexistent.txt",
+			expectedExists: false,
+		},
+		{
+			name:           "File exists in /anotherpath",
+			adjustedPath:   "/anotherpath",
+			expectedName:   "afile.txt",
+			expectedExists: true,
+		},
+		{
+			name:           "File does not exist in /anotherpath",
+			adjustedPath:   "/anotherpath",
+			expectedName:   "nonexistentfile.txt",
+			expectedExists: false,
+		},
+		{
+			name:           "Directory does not exist",
+			adjustedPath:   "/nonexistentpath",
+			expectedName:   "",
+			expectedExists: false,
+		},
+	}
+	for _, tt := range tests {
+		t.Run(tt.name, func(t *testing.T) {
+			fileInfo, _ := testIndex.GetMetadataInfo(tt.adjustedPath)
+			found := false
+			// Iterate over fileInfo.Items to look for expectedName
+			for _, item := range fileInfo.ReducedItems {
+				// Assert the existence and the name
+				if item.Name == tt.expectedName {
+					found = true
+					break
+				}
+			}
+			assert.Equal(t, tt.expectedExists, found)
+		})
+	}
+}
 // Test for UpdateFileMetadata
 func TestUpdateFileMetadata(t *testing.T) {
 	index := &Index{
-		Directories: map[string]Directory{
+		Directories: map[string]FileInfo{
 			"/testpath": {
-				Metadata: map[string]FileInfo{
-					"testfile.txt":    {Name: "testfile.txt"},
-					"anotherfile.txt": {Name: "anotherfile.txt"},
+				Path:  "/testpath",
+				Name:  "testpath",
+				IsDir: true,
+				ReducedItems: []ReducedItem{
+					{Name: "testfile.txt"},
+					{Name: "anotherfile.txt"},
 				},
 			},
 		},
@ -100,7 +126,7 @@ func TestUpdateFileMetadata(t *testing.T) {
 	}

 	dir, exists := index.Directories["/testpath"]
-	if !exists || dir.Metadata["testfile.txt"].Name != "testfile.txt" {
+	if !exists || dir.ReducedItems[0].Name != "testfile.txt" {
 		t.Fatalf("expected testfile.txt to be updated in the directory metadata")
 	}
 }
@ -122,19 +148,29 @@ func TestGetDirMetadata(t *testing.T) {
 // Test for SetDirectoryInfo
 func TestSetDirectoryInfo(t *testing.T) {
 	index := &Index{
-		Directories: map[string]Directory{
+		Directories: map[string]FileInfo{
 			"/testpath": {
-				Metadata: map[string]FileInfo{
-					"testfile.txt":    {Name: "testfile.txt"},
-					"anotherfile.txt": {Name: "anotherfile.txt"},
+				Path:  "/testpath",
+				Name:  "testpath",
+				IsDir: true,
+				Items: []*FileInfo{
+					{Name: "testfile.txt"},
+					{Name: "anotherfile.txt"},
 				},
 			},
 		},
 	}
-	dir := Directory{Metadata: map[string]FileInfo{"testfile.txt": {Name: "testfile.txt"}}}
+	dir := FileInfo{
+		Path:  "/newPath",
+		Name:  "newPath",
+		IsDir: true,
+		Items: []*FileInfo{
+			{Name: "testfile.txt"},
+		},
+	}
 	index.SetDirectoryInfo("/newPath", dir)
 	storedDir, exists := index.Directories["/newPath"]
-	if !exists || storedDir.Metadata["testfile.txt"].Name != "testfile.txt" {
+	if !exists || storedDir.Items[0].Name != "testfile.txt" {
 		t.Fatalf("expected SetDirectoryInfo to store directory info correctly")
 	}
 }
@ -143,7 +179,7 @@ func TestSetDirectoryInfo(t *testing.T) {
 func TestGetDirectoryInfo(t *testing.T) {
 	t.Parallel()
 	dir, exists := testIndex.GetDirectoryInfo("/testpath")
-	if !exists || dir.Metadata["testfile.txt"].Name != "testfile.txt" {
+	if !exists || dir.Items[0].Name != "testfile.txt" {
 		t.Fatalf("expected GetDirectoryInfo to return correct directory info")
 	}
@ -156,7 +192,7 @@ func TestGetDirectoryInfo(t *testing.T) {
 // Test for RemoveDirectory
 func TestRemoveDirectory(t *testing.T) {
 	index := &Index{
-		Directories: map[string]Directory{
+		Directories: map[string]FileInfo{
 			"/testpath": {},
 		},
 	}
@ -194,27 +230,33 @@ func TestUpdateCount(t *testing.T) {
 func init() {
 	testIndex = Index{
+		Root:       "/",
 		NumFiles:   10,
 		NumDirs:    5,
 		inProgress: false,
-		Directories: map[string]Directory{
+		Directories: map[string]FileInfo{
 			"/testpath": {
-				Metadata: map[string]FileInfo{
-					"testfile.txt":    {Name: "testfile.txt"},
-					"anotherfile.txt": {Name: "anotherfile.txt"},
+				Path:     "/testpath",
+				Name:     "testpath",
+				IsDir:    true,
+				NumDirs:  1,
+				NumFiles: 2,
+				Items: []*FileInfo{
+					{Name: "testfile.txt", Size: 100},
+					{Name: "anotherfile.txt", Size: 100},
 				},
 			},
 			"/anotherpath": {
-				Metadata: map[string]FileInfo{
-					"afile.txt": {Name: "afile.txt"},
+				Path:     "/anotherpath",
+				Name:     "anotherpath",
+				IsDir:    true,
+				NumDirs:  1,
+				NumFiles: 1,
+				Items: []*FileInfo{
+					{Name: "directory", IsDir: true, Size: 100},
+					{Name: "afile.txt", Size: 100},
 				},
 			},
 		},
 	}
-	files := []fs.FileInfo{
-		mockFileInfo{name: "file1.txt", isDir: false},
-		mockFileInfo{name: "dir1", isDir: true},
-	}
-	testIndex.UpdateQuickList(files)
 }

View File

@ -15,11 +15,20 @@ type modifyRequest struct {
 	Which []string `json:"which"` // Answer to: which fields?
 }

+var (
+	store     *storage.Storage
+	server    *settings.Server
+	fileCache FileCache
+)
+
+func SetupEnv(storage *storage.Storage, s *settings.Server, cache FileCache) {
+	store = storage
+	server = s
+	fileCache = cache
+}
+
 func NewHandler(
 	imgSvc ImgService,
-	fileCache FileCache,
-	store *storage.Storage,
-	server *settings.Server,
 	assetsFs fs.FS,
 ) (http.Handler, error) {
 	server.Clean()
View File

@ -11,6 +11,7 @@ import (
 	"github.com/gtsteffaniak/filebrowser/settings"
 	"github.com/gtsteffaniak/filebrowser/share"
+	"github.com/gtsteffaniak/filebrowser/storage"
 	"github.com/gtsteffaniak/filebrowser/storage/bolt"
 	"github.com/gtsteffaniak/filebrowser/users"
 )
@ -73,8 +74,13 @@ func TestPublicShareHandlerAuthentication(t *testing.T) {
 			t.Errorf("failed to close db: %v", err)
 		}
 	})
-	storage, err := bolt.NewStorage(db)
+	authStore, userStore, shareStore, settingsStore, err := bolt.NewStorage(db)
+	storage := &storage.Storage{
+		Auth:     authStore,
+		Users:    userStore,
+		Share:    shareStore,
+		Settings: settingsStore,
+	}
 	if err != nil {
 		t.Fatalf("failed to get storage: %v", err)
 	}

View File

@ -2,7 +2,6 @@ package http
 import (
 	"encoding/json"
-	"log"
 	"net/http"
 	"reflect"
 	"sort"
@ -14,6 +13,7 @@ import (
 	"github.com/gtsteffaniak/filebrowser/errors"
 	"github.com/gtsteffaniak/filebrowser/files"
+	"github.com/gtsteffaniak/filebrowser/storage"
 	"github.com/gtsteffaniak/filebrowser/users"
 )
@ -130,21 +130,7 @@ var userPostHandler = withAdmin(func(w http.ResponseWriter, r *http.Request, d *
 		return http.StatusBadRequest, errors.ErrEmptyPassword
 	}

-	newUser := users.ApplyDefaults(*req.Data)
-	userHome, err := d.settings.MakeUserDir(req.Data.Username, req.Data.Scope, d.server.Root)
-	if err != nil {
-		log.Printf("create user: failed to mkdir user home dir: [%s]", userHome)
-		return http.StatusInternalServerError, err
-	}
-	newUser.Scope = userHome
-	log.Printf("user: %s, home dir: [%s].", req.Data.Username, userHome)
-	_, _, err = files.GetRealPath(d.server.Root, req.Data.Scope)
-	if err != nil {
-		log.Println("user path is not valid", req.Data.Scope)
-		return http.StatusBadRequest, nil
-	}
-	err = d.store.Users.Save(&newUser)
+	err = storage.CreateUser(*req.Data, req.Data.Perm.Admin)
 	if err != nil {
 		return http.StatusInternalServerError, err
 	}

View File

@ -34,15 +34,14 @@ func loadConfigFile(configFile string) []byte {
 	// Open and read the YAML file
 	yamlFile, err := os.Open(configFile)
 	if err != nil {
-		log.Printf("ERROR: opening config file\n %v\n WARNING: Using default config only\n If this was a mistake, please make sure the file exists and is accessible by the filebrowser binary.\n\n", err)
-		Config = setDefaults()
-		return []byte{}
+		log.Println(err)
+		os.Exit(1)
 	}
 	defer yamlFile.Close()

 	stat, err := yamlFile.Stat()
 	if err != nil {
-		log.Fatalf("Error getting file information: %s", err.Error())
+		log.Fatalf("error getting file information: %s", err.Error())
 	}
 	yamlData := make([]byte, stat.Size())

View File

@ -39,3 +39,15 @@ func GenerateKey() ([]byte, error) {
 func GetSettingsConfig(nameType string, Value string) string {
 	return nameType + Value
 }
+
+func AdminPerms() Permissions {
+	return Permissions{
+		Create:   true,
+		Rename:   true,
+		Modify:   true,
+		Delete:   true,
+		Share:    true,
+		Download: true,
+		Admin:    true,
+	}
+}

View File

@ -28,5 +28,5 @@ func (s authBackend) Get(t string) (auth.Auther, error) {
 }

 func (s authBackend) Save(a auth.Auther) error {
-	return save(s.db, "auther", a)
+	return Save(s.db, "auther", a)
 }

View File

@ -6,26 +6,14 @@ import (
 	"github.com/gtsteffaniak/filebrowser/auth"
 	"github.com/gtsteffaniak/filebrowser/settings"
 	"github.com/gtsteffaniak/filebrowser/share"
-	"github.com/gtsteffaniak/filebrowser/storage"
 	"github.com/gtsteffaniak/filebrowser/users"
 )

 // NewStorage creates a storage.Storage based on Bolt DB.
-func NewStorage(db *storm.DB) (*storage.Storage, error) {
+func NewStorage(db *storm.DB) (*auth.Storage, *users.Storage, *share.Storage, *settings.Storage, error) {
 	userStore := users.NewStorage(usersBackend{db: db})
 	shareStore := share.NewStorage(shareBackend{db: db})
 	settingsStore := settings.NewStorage(settingsBackend{db: db})
 	authStore := auth.NewStorage(authBackend{db: db}, userStore)
-
-	err := save(db, "version", 2) //nolint:gomnd
-	if err != nil {
-		return nil, err
-	}
-
-	return &storage.Storage{
-		Auth:     authStore,
-		Users:    userStore,
-		Share:    shareStore,
-		Settings: settingsStore,
-	}, nil
+	return authStore, userStore, shareStore, settingsStore, nil
 }

View File

@ -15,7 +15,7 @@ func (s settingsBackend) Get() (*settings.Settings, error) {
 }

 func (s settingsBackend) Save(set *settings.Settings) error {
-	return save(s.db, "settings", set)
+	return Save(s.db, "settings", set)
 }

 func (s settingsBackend) GetServer() (*settings.Server, error) {
@ -27,5 +27,5 @@ func (s settingsBackend) GetServer() (*settings.Server, error) {
 }

 func (s settingsBackend) SaveServer(server *settings.Server) error {
-	return save(s.db, "server", server)
+	return Save(s.db, "server", server)
 }

View File

@ -15,6 +15,6 @@ func get(db *storm.DB, name string, to interface{}) error {
 	return err
 }

-func save(db *storm.DB, name string, from interface{}) error {
+func Save(db *storm.DB, name string, from interface{}) error {
 	return db.Set("config", name, from)
 }

View File

@ -1,10 +1,20 @@
 package storage

 import (
+	"fmt"
+	"log"
+	"os"
+	"path/filepath"
+
+	"github.com/asdine/storm/v3"
+
 	"github.com/gtsteffaniak/filebrowser/auth"
+	"github.com/gtsteffaniak/filebrowser/errors"
+	"github.com/gtsteffaniak/filebrowser/files"
 	"github.com/gtsteffaniak/filebrowser/settings"
 	"github.com/gtsteffaniak/filebrowser/share"
+	"github.com/gtsteffaniak/filebrowser/storage/bolt"
 	"github.com/gtsteffaniak/filebrowser/users"
+	"github.com/gtsteffaniak/filebrowser/utils"
 )

 // Storage is a storage powered by a Backend which makes the necessary
@ -15,3 +25,112 @@ type Storage struct {
 	Auth     *auth.Storage
 	Settings *settings.Storage
 }
var store *Storage
func InitializeDb(path string) (*Storage, bool, error) {
exists, err := dbExists(path)
if err != nil {
panic(err)
}
db, err := storm.Open(path)
utils.CheckErr(fmt.Sprintf("storm.Open path %v", path), err)
authStore, userStore, shareStore, settingsStore, err := bolt.NewStorage(db)
if err != nil {
return nil, exists, err
}
err = bolt.Save(db, "version", 2) //nolint:gomnd
if err != nil {
return nil, exists, err
}
store = &Storage{
Auth: authStore,
Users: userStore,
Share: shareStore,
Settings: settingsStore,
}
if !exists {
quickSetup(store)
}
return store, exists, err
}
func dbExists(path string) (bool, error) {
stat, err := os.Stat(path)
if err == nil {
return stat.Size() != 0, nil
}
if os.IsNotExist(err) {
d := filepath.Dir(path)
_, err = os.Stat(d)
if os.IsNotExist(err) {
if err := os.MkdirAll(d, 0700); err != nil { //nolint:govet,gomnd
return false, err
}
return false, nil
}
}
return false, err
}
func quickSetup(store *Storage) {
settings.Config.Auth.Key = utils.GenerateKey()
if settings.Config.Auth.Method == "noauth" {
err := store.Auth.Save(&auth.NoAuth{})
utils.CheckErr("store.Auth.Save", err)
} else {
settings.Config.Auth.Method = "password"
err := store.Auth.Save(&auth.JSONAuth{})
utils.CheckErr("store.Auth.Save", err)
}
err := store.Settings.Save(&settings.Config)
utils.CheckErr("store.Settings.Save", err)
err = store.Settings.SaveServer(&settings.Config.Server)
utils.CheckErr("store.Settings.SaveServer", err)
user := users.ApplyDefaults(users.User{})
user.Username = settings.Config.Auth.AdminUsername
user.Password = settings.Config.Auth.AdminPassword
user.Perm.Admin = true
user.Scope = "./"
user.DarkMode = true
user.ViewMode = "normal"
user.LockPassword = false
user.Perm = settings.AdminPerms()
err = store.Users.Save(&user)
utils.CheckErr("store.Users.Save", err)
}
// CreateUser creates a new user. Both a username and a password are required.
func CreateUser(userInfo users.User, asAdmin bool) error {
	if userInfo.Username == "" || userInfo.Password == "" {
		return errors.ErrInvalidRequestParams
	}
newUser := users.ApplyDefaults(userInfo)
if asAdmin {
newUser.Perm = settings.AdminPerms()
}
// create new home directory
userHome, err := settings.Config.MakeUserDir(newUser.Username, newUser.Scope, settings.Config.Server.Root)
if err != nil {
log.Printf("create user: failed to mkdir user home dir: [%s]", userHome)
return err
}
newUser.Scope = userHome
log.Printf("user: %s, home dir: [%s].", newUser.Username, userHome)
	_, _, err = files.GetRealPath(settings.Config.Server.Root, newUser.Scope)
	if err != nil {
		log.Println("user path is not valid", newUser.Scope)
		return err
	}
err = store.Users.Save(&newUser)
if err != nil {
return err
}
return nil
}

backend/utils/main.go (new file)

@@ -0,0 +1,19 @@
package utils
import (
"log"
"github.com/gtsteffaniak/filebrowser/settings"
)
func CheckErr(source string, err error) {
if err != nil {
log.Fatalf("%s: %v", source, err)
}
}
func GenerateKey() []byte {
k, err := settings.GenerateKey()
CheckErr("generateKey", err)
return k
}

docs/contributing.md (new file)

@@ -0,0 +1,2 @@
# Contributing Guide

docs/getting_started.md (new file)

@@ -0,0 +1,2 @@
# Getting Started using FileBrowser Quantum

docs/migration.md (new file)

@@ -0,0 +1,22 @@
# Migration help

It is possible to reuse a database created by filebrowser/filebrowser, but you
will need to follow these steps:

1. Create a configuration file as mentioned above.
2. Copy your database file from the original filebrowser location to the path
   used by the new one.
3. Update the configuration file to point at the database (under `server` in
   `filebrowser.yml`).
4. If you are using docker, update the docker-compose file or `docker run`
   command to use the config file as described in the install section above.
5. If you are not using docker, make sure you run `filebrowser -c
   filebrowser.yml` with a valid filebrowser config.

Note: share links will not work and will need to be re-created after migration.

FileBrowser Quantum should run with the same users and rules as the original,
and all user configuration should carry over. Keep in mind, however, that some
features behave differently and may not work exactly the same way.
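For step 3, a minimal sketch of what the config might look like. The exact key names (`database`, `root`) and paths here are assumptions for illustration; check the configuration reference for your version:

```yaml
# filebrowser.yml — hypothetical minimal migration example.
server:
  root: /srv                      # directory to serve (assumed key name)
  database: /config/database.db   # path to the copied filebrowser database (assumed key name)
```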

docs/roadmap.md (new file)

@@ -0,0 +1,24 @@
# Planned Roadmap

Upcoming 0.2.x releases:

- Replace gorilla/mux http routes with the standard library
- Theme configuration from settings
- File synchronization improvements
- More filetype previews

Next major 0.3.0 release:

- Multiple sources https://github.com/filebrowser/filebrowser/issues/2514
- Introduce jobs as a replacement for runners
- Add job status to the sidebar
  - index status
  - job status from users
  - upload status

Unplanned future releases:

- Add tools to the sidebar
  - duplicate file detector
  - bulk rename https://github.com/filebrowser/filebrowser/issues/2473
  - metrics tracker - user access, file access, download count, last login, etc.
- Support minio, s3, and backblaze sources https://github.com/filebrowser/filebrowser/issues/2544


@@ -19,18 +19,20 @@
       <input
         v-model="gallerySize"
         type="range"
-        id="gallary-size"
-        name="gallary-size"
+        id="gallery-size"
+        name="gallery-size"
         :value="gallerySize"
         min="0"
         max="10"
+        @input="updateGallerySize"
+        @change="commitGallerySize"
       />
     </div>
   </div>
 </template>
 <script>
-import { state, mutations, getters } from "@/store"; // Import mutations as well
+import { state, mutations, getters } from "@/store";
 import Action from "@/components/Action.vue";
 export default {
@@ -43,12 +45,6 @@ export default {
       gallerySize: state.user.gallerySize,
     };
   },
-  watch: {
-    gallerySize(newValue) {
-      this.gallerySize = parseInt(newValue, 0); // Update the user object
-      mutations.setGallerySize(this.gallerySize);
-    },
-  },
   props: ["base", "noLink"],
   computed: {
     isCardView() {
@@ -100,13 +96,16 @@ export default {
       return "router-link";
     },
     showShare() {
-      // Ensure user properties are accessed safely
-      if (state.route.path.startsWith("/share")) {
-        return false;
-      }
-      return state.user?.perm && state.user?.perm.share; // Access from state directly
+      return state.user?.perm && state.user?.perm.share;
     },
   },
   methods: {
+    updateGallerySize(event) {
+      this.gallerySize = parseInt(event.target.value, 10);
+    },
+    commitGallerySize() {
+      mutations.setGallerySize(this.gallerySize);
+    },
   },
 };
 </script>


@@ -166,10 +166,6 @@
       <b>Multiple Search terms:</b> Additional terms separated by <code>|</code>,
       for example <code>"test|not"</code> searches for both terms independently.
     </p>
-    <p>
-      <b>File size:</b> Searching files by size may have significantly longer search
-      times.
-    </p>
   </div>
   <!-- List of search results -->
   <ul v-show="results.length > 0">
@@ -311,6 +307,9 @@ export default {
       path = path.slice(1);
       path = "./" + path.substring(path.indexOf("/") + 1);
       path = path.replace(/\/+$/, "") + "/";
+      if (path == "./files/") {
+        path = "./";
+      }
       return path;
     },
   },
@@ -391,10 +390,10 @@ export default {
         return;
       }
       let searchTypesFull = this.searchTypes;
-      if (this.largerThan != "") {
+      if (this.largerThan != "" && !this.isTypeSelectDisabled) {
         searchTypesFull = searchTypesFull + "type:largerThan=" + this.largerThan + " ";
       }
-      if (this.smallerThan != "") {
+      if (this.smallerThan != "" && !this.isTypeSelectDisabled) {
         searchTypesFull = searchTypesFull + "type:smallerThan=" + this.smallerThan + " ";
       }
       let path = state.route.path;


@@ -1,7 +1,7 @@
 <template>
   <component
-    :is="isSelected || user.singleClick ? 'a' : 'div'"
-    :href="isSelected || user.singleClick ? url : undefined"
+    :is="quickNav ? 'a' : 'div'"
+    :href="quickNav ? url : undefined"
     :class="{
       item: true,
       activebutton: isMaximized && isSelected,
@@ -16,7 +16,7 @@
     :data-type="type"
     :aria-label="name"
     :aria-selected="isSelected"
-    @click="isSelected || user.singleClick ? toggleClick() : itemClick($event)"
+    @click="quickNav ? toggleClick() : itemClick($event)"
   >
     <div @click="toggleClick" :class="{ activetitle: isMaximized && isSelected }">
       <img
@@ -34,8 +34,7 @@
     <div class="text" :class="{ activecontent: isMaximized && isSelected }">
       <p class="name">{{ name }}</p>
-      <p v-if="isDir" class="size" data-order="-1">&mdash;</p>
-      <p v-else class="size" :data-order="humanSize()">{{ humanSize() }}</p>
+      <p class="size" :data-order="humanSize()">{{ humanSize() }}</p>
       <p class="modified">
         <time :datetime="modified">{{ humanTime() }}</time>
       </p>
@@ -93,6 +92,9 @@ export default {
     "path",
   ],
   computed: {
+    quickNav() {
+      return state.user.singleClick && !state.multiple;
+    },
     user() {
       return state.user;
     },
@@ -263,6 +265,7 @@ export default {
       action(overwrite, rename);
     },
     itemClick(event) {
+      console.log("should say something");
       if (this.singleClick && !state.multiple) this.open();
       else this.click(event);
     },
@@ -271,7 +274,7 @@ export default {
       setTimeout(() => {
         this.touches = 0;
-      }, 300);
+      }, 500);
       this.touches++;
       if (this.touches > 1) {


@@ -9,6 +9,7 @@ export const mutations = {
   setGallerySize: (value) => {
     state.user.gallerySize = value
     emitStateChanged();
+    users.update(state.user,['gallerySize']);
   },
   setActiveSettingsView: (value) => {
     state.activeSettingsView = value;
@@ -195,19 +196,20 @@ export const mutations = {
     emitStateChanged();
   },
   setRoute: (value) => {
+    console.log("going...",value)
     state.route = value;
     emitStateChanged();
   },
   updateListingSortConfig: ({ field, asc }) => {
-    state.req.sorting.by = field;
-    state.req.sorting.asc = asc;
+    state.user.sorting.by = field;
+    state.user.sorting.asc = asc;
     emitStateChanged();
   },
   updateListingItems: () => {
     state.req.items.sort((a, b) => {
-      const valueA = a[state.req.sorting.by];
-      const valueB = b[state.req.sorting.by];
-      if (state.req.sorting.asc) {
+      const valueA = a[state.user.sorting.by];
+      const valueB = b[state.user.sorting.by];
+      if (state.user.sorting.asc) {
         return valueA > valueB ? 1 : -1;
       } else {
         return valueA < valueB ? 1 : -1;


@@ -7,24 +7,24 @@ export function getHumanReadableFilesize(fileSizeBytes) {
   switch (true) {
     case fileSizeBytes < 1024:
       break;
-    case fileSizeBytes < 1000 ** 2: // 1 KB - 1 MB
-      size = fileSizeBytes / 1000;
+    case fileSizeBytes < 1024 ** 2: // 1 KB - 1 MB
+      size = fileSizeBytes / 1024;
       unit = 'KB';
       break;
-    case fileSizeBytes < 1000 ** 3: // 1 MB - 1 GB
-      size = fileSizeBytes / (1000 ** 2);
+    case fileSizeBytes < 1024 ** 3: // 1 MB - 1 GB
+      size = fileSizeBytes / (1024 ** 2);
       unit = 'MB';
       break;
-    case fileSizeBytes < 1000 ** 4: // 1 GB - 1 TB
-      size = fileSizeBytes / (1000 ** 3);
+    case fileSizeBytes < 1024 ** 4: // 1 GB - 1 TB
+      size = fileSizeBytes / (1024 ** 3);
       unit = 'GB';
       break;
-    case fileSizeBytes < 1000 ** 5: // 1 TB - 1 PB
-      size = fileSizeBytes / (1000 ** 4);
+    case fileSizeBytes < 1024 ** 5: // 1 TB - 1 PB
+      size = fileSizeBytes / (1024 ** 4);
       unit = 'TB';
       break;
     default: // >= 1 PB
-      size = fileSizeBytes / (1000 ** 5);
+      size = fileSizeBytes / (1024 ** 5);
       unit = 'PB';
       break;
   }
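The hunk above switches the size thresholds from base 1000 to base 1024 (matching Windows Explorer, per the changelog). A self-contained sketch of the resulting behavior; the return formatting and rounding are assumptions, since the hunk does not show them:

```javascript
// Standalone sketch of a base-1024 size formatter mirroring the updated logic.
// The rounding and return format are illustrative assumptions.
function getHumanReadableFilesize(fileSizeBytes) {
  let size = fileSizeBytes;
  let unit = "bytes";
  const units = ["KB", "MB", "GB", "TB", "PB"];
  // Walk from the largest unit down; 1 KB = 1024 bytes.
  for (let i = units.length; i >= 1; i--) {
    if (fileSizeBytes >= 1024 ** i) {
      size = fileSizeBytes / 1024 ** i;
      unit = units[i - 1];
      break;
    }
  }
  return `${Math.round(size * 100) / 100} ${unit}`;
}

console.log(getHumanReadableFilesize(1536)); // 1.5 KB
console.log(getHumanReadableFilesize(15 * 1024 ** 3)); // 15 GB
```

With base 1024, a 1536-byte file reads as "1.5 KB" rather than the "1.54 KB" a base-1000 formatter would report.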


@@ -51,13 +51,13 @@ export default {
       return state.selected;
     },
     nameSorted() {
-      return state.req.sorting.by === "name";
+      return state.user.sorting.by === "name";
     },
     sizeSorted() {
-      return state.req.sorting.by === "size";
+      return state.user.sorting.by === "size";
     },
     modifiedSorted() {
-      return state.req.sorting.by === "modified";
+      return state.user.sorting.by === "modified";
     },
     ascOrdered() {
       return state.req.sorting.asc;
@@ -297,7 +297,7 @@ export default {
       const currentIndex = this.viewModes.indexOf(state.user.viewMode);
       const nextIndex = (currentIndex + 1) % this.viewModes.length;
       const newView = this.viewModes[nextIndex];
-      mutations.updateCurrentUser({ "viewMode": newView });
+      mutations.updateCurrentUser({ viewMode: newView });
     },
     preventDefault(event) {
       // Wrapper around prevent default.


@@ -207,16 +207,16 @@ export default {
       return state.multiple;
     },
     nameSorted() {
-      return state.req.sorting.by === "name";
+      return state.user.sorting.by === "name";
     },
     sizeSorted() {
-      return state.req.sorting.by === "size";
+      return state.user.sorting.by === "size";
     },
     modifiedSorted() {
-      return state.req.sorting.by === "modified";
+      return state.user.sorting.by === "modified";
     },
     ascOrdered() {
-      return state.req.sorting.asc;
+      return state.user.sorting.asc;
     },
     items() {
       return getters.reqItems();
@@ -443,7 +443,7 @@ export default {
         return;
       }
       if (noModifierKeys && getters.currentPromptName() != null) {
-        return
+        return;
       }
       // Handle the space bar key
       if (key === " ") {


@@ -1,27 +0,0 @@
# Planned Roadmap
next 0.2.x release:
- Theme configuration from settings
- File syncronization improvements
- right-click context menu
initial 0.3.0 release :
- database changes
- introduce jobs as replacement to runners.
- Add Job status to the sidebar
- index status.
- Job status from users
- upload status
Future releases:
- Replace http routes for gorilla/mux with pocketbase
- Allow multiple volumes to show up in the same filebrowser container. https://github.com/filebrowser/filebrowser/issues/2514
- enable/disable indexing for certain mounts
- Add tools to sidebar
- duplicate file detector.
- bulk rename https://github.com/filebrowser/filebrowser/issues/2473
- job manager - folder sync, copy, lifecycle operations
- metrics tracker - user access, file access, download count, last login, etc
- support minio s3 and backblaze sources https://github.com/filebrowser/filebrowser/issues/2544